Combinatorial meets Convex Leveraging combinatorial structure to obtain faster algorithms, provably
1 Combinatorial meets Convex Leveraging combinatorial structure to obtain faster algorithms, provably Lorenzo Orecchia Department of Computer Science 6/30/15 Challenges in Optimization for Data Science 1
2 A Tale of Two Disciplines
3 A Tale of Two Disciplines Combinatorial Optimization Asks about paths, flows, cuts, clustering, routing. Questions and solution techniques tend to be discrete/combinatorial.
4 A Tale of Two Disciplines Numerical and Convex Optimization Asks about matrices, convex programs (e.g. LPs, SDPs). Questions and solution techniques tend to be continuous/numerical.
5 A Tale of Two Disciplines NEW INSIGHT: Deep connections between core concepts. Use techniques from one to help the other.
6 A Tale of Two Disciplines Combining the two yields Fast Algorithms for Fundamental Graph Problems.
7 Why Graph Algorithms? Computer Networks Social Networks Transportation Networks Representations of Physical Objects
8 Why Fast Graph Algorithms? Classical Algorithms (1970s-1990s): the standard notion of efficiency is polynomial running time. Today: graphs of interest are getting larger and larger, and even quadratic running time is infeasibly large for many instances.
9 Why Fast Graph Algorithms
10 Why Fast Graph Algorithms? New Efficiency Requirement: running time as close as possible to linear in the input size: NEARLY-LINEAR RUNNING TIME. [plot: running time vs. input size]
12 Nearly-Linear-Time Algorithms GOAL: Build a library of primitives running in nearly-linear time. What can you do to a graph in nearly-linear time?
13 Nearly-Linear-Time Algorithms REACHABILITY PROBLEM: Is there a path between vertices u and v in the graph G? YES, in nearly-linear time, by depth-first search.
15 Nearly-Linear-Time Algorithms SHORTEST PATH PROBLEM: What is the shortest path from u to v? YES, in nearly-linear time, by breadth-first search.
17 Nearly-Linear-Time Algorithms Nearly-Linear Time: Reachability, Shortest Path, Connectivity, Minimum Cost Spanning Tree.
18 Nearly-Linear-Time Algorithms These primitives probe graph structure in a simple way.
19 Nearly-Linear-Time Algorithms ELECTRICAL FLOW PROBLEM: What is the current flow when one unit of current enters s and one leaves t?
20 Nearly-Linear-Time Algorithms Electrical flow in nearly-linear time, by solving Laplacian Systems of Linear Equations [Spielman, Teng 04].
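As a minimal sketch of the reduction just described (not the nearly-linear-time solver itself), the electrical flow follows from solving the Laplacian system Lv = b and applying Ohm's Law. The tiny example graph, unit resistances, and dense pseudoinverse solve are all illustrative choices:

```python
import numpy as np

# Illustrative 4-vertex graph with unit-resistance edges.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
n = 4
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1; L[v, v] += 1   # degree terms
    L[u, v] -= 1; L[v, u] -= 1   # adjacency terms

s, t = 0, 3
b = np.zeros(n); b[s], b[t] = 1.0, -1.0   # one unit of current in at s, out at t

# L is singular (constant vectors lie in its kernel), so solve in the
# pseudoinverse sense; b is orthogonal to the kernel since it sums to zero.
voltages = np.linalg.pinv(L) @ b

# Ohm's Law recovers the flow on each edge: f(u,v) = (v_u - v_v) / r_e.
flow = {(u, v): voltages[u] - voltages[v] for u, v in edges}

# Flow conservation: the net flow out of s equals the injected unit.
net_out_s = sum(f for (u, v), f in flow.items() if u == s) - \
            sum(f for (u, v), f in flow.items() if v == s)
print(round(net_out_s, 6))
```

A nearly-linear-time Laplacian solver replaces the `pinv` call here; everything else in the reduction is unchanged.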
22 Nearly-Linear-Time Algorithms Also in nearly-linear time: Undirected Flow Problems: s-t Maximum Flow [Kelner, Orecchia, Sidford, Lee 13] [Sherman 13]; Concurrent Multi-commodity Flow [Kelner, Orecchia, Sidford, Lee 13]; Oblivious Routing [Kelner, Orecchia, Sidford, Lee 13]. Undirected Cut Problems: Minimum s-t Cut [Kelner, Orecchia, Sidford, Lee 13] [Sherman 13]; Approximate Sparsest Cut [Kelner, Orecchia, Sidford, Lee 13] [Sherman 13]; Approximate Minimum Conductance Cut [Orecchia, Sachdeva, Vishnoi 12].
23 A Faster, Simpler Laplacian Solver
24 A Faster, Simpler Laplacian Solver INITIALIZATION: Choose a spanning tree T of G.
25 A Faster, Simpler Laplacian Solver INITIALIZATION: Choose a spanning tree T of G. Route flow along T to obtain initial flow f0.
27 A Faster, Simpler Laplacian Solver
28 A Faster, Simpler Laplacian Solver INITIALIZATION: Choose a spanning tree T of G. Route flow along T to obtain initial flow f0. NOTE: Flow f0 is the electrical flow if and only if Ohm's Law holds for all edges e = (b,a). Apply Ohm's Law to the spanning-tree edges to deduce voltages.
31 Our Fastest, Simplest Laplacian Solver INITIALIZATION: Choose a spanning tree T of G. Route flow along T to obtain initial flow f0. NOTE: Flow f0 is the electrical flow if and only if Ohm's Law holds for all edges e = (b,a). Apply Ohm's Law to the spanning-tree edges to deduce voltages. Check if voltages and flows obey Ohm's Law on off-tree edges.
33 Our Fastest, Simplest Laplacian Solver [diagram: the Ohm's Law check FAILs on two off-tree edges]
34 Our Fastest, Simplest Laplacian Solver
35 Our Fastest, Simplest Laplacian Solver INITIALIZATION: Choose a spanning tree T of G. Route flow along T to obtain initial flow f0. MAIN LOOP (CYCLE FIXING): Apply Ohm's Law to the spanning-tree edges to deduce voltages. Check if voltages and flows obey Ohm's Law on off-tree edges. Consider the cycle corresponding to a failing off-tree edge e.
36 Our Fastest, Simplest Laplacian Solver MAIN LOOP (CYCLE FIXING), continued: Send flow around the cycle until Ohm's Law is satisfied on e.
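The cycle-fixing loop above can be sketched in a few dozen lines. This is a toy illustration of the idea, not the talk's solver: the graph, the fixed spanning tree, and the helper names (`tree_path`, `push`, `fix`) are invented for the example, and a real implementation would use a low-stretch tree plus data structures supporting fast cycle updates.

```python
import random
from collections import defaultdict, deque

# Toy graph: edge -> resistance; a fixed spanning tree T.
edges = {('a','b'):1.0, ('b','c'):1.0, ('c','d'):1.0, ('a','c'):1.0, ('b','d'):1.0}
tree  = {('a','b'), ('b','c'), ('c','d')}

adj = defaultdict(list)
for (u, v) in tree:
    adj[u].append(v); adj[v].append(u)

def tree_path(u, v):
    """Vertices of the unique u->v path in T (BFS suffices: T is a tree)."""
    parent = {u: None}
    q = deque([u])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in parent:
                parent[y] = x; q.append(y)
    path = [v]
    while path[-1] != u:
        path.append(parent[path[-1]])
    return path[::-1]

flow = defaultdict(float)

def push(u, v, amt):
    """Add amt units of flow on edge {u,v}, oriented u->v."""
    if (u, v) in edges: flow[(u, v)] += amt
    else:               flow[(v, u)] -= amt

# INITIALIZATION: route one unit from 'a' to 'd' along the tree.
p = tree_path('a', 'd')
for x, y in zip(p, p[1:]):
    push(x, y, 1.0)

def cycle_of(e):
    """Oriented edge list of the cycle closed by off-tree edge e = (u,v)."""
    u, v = e
    p = tree_path(v, u)
    return [(u, v)] + list(zip(p, p[1:]))

def fix(e):
    """Send flow around e's cycle until Ohm's Law holds on e, i.e. until the
    total voltage drop around the cycle is zero."""
    C = cycle_of(e)
    def r(x, y): return edges.get((x, y), edges.get((y, x)))
    def f(x, y): return flow[(x, y)] if (x, y) in edges else -flow[(y, x)]
    drop = sum(r(x, y) * f(x, y) for x, y in C)   # Ohm's Law violation
    R = sum(r(x, y) for x, y in C)                # total cycle resistance
    for x, y in C:
        push(x, y, -drop / R)

# MAIN LOOP: repeatedly fix a (random) off-tree edge.
off_tree = [e for e in edges if e not in tree]
random.seed(0)
for _ in range(200):
    fix(random.choice(off_tree))
```

After the loop, the flow is numerically indistinguishable from the electrical flow: the voltage drop around every off-tree cycle has been driven to zero. Each `fix` is exactly the "send flow around the cycle until Ohm's Law is satisfied on e" step of the slides.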
38 Our Fastest, Simplest Laplacian Solver
39 Our Fastest, Simplest Laplacian Solver [diagram: updated flow and voltage values after one cycle fix] INITIALIZATION: Choose a spanning tree T of G. Route flow along T to obtain initial flow f0. MAIN LOOP (CYCLE FIXING): Apply Ohm's Law to the spanning-tree edges to deduce voltages. Check if voltages and flows obey Ohm's Law on off-tree edges. Consider the cycle corresponding to a failing off-tree edge e. Send flow around the cycle until Ohm's Law is satisfied on e. Repeat.
43 Our Fastest, Simplest Laplacian Solver [diagram: the check now passes (OK) on the fixed off-tree edge but still FAILs on another, so the loop repeats]
46 Working in the Space of Cycles START: a geometric interpretation of the electrical flow algorithm. The initial flow f0 and the electrical flow f* both lie in the subspace of flows routing the required current input/output.
47 Working in the Space of Cycles A cycle update on C1 moves f0 to f1. NB: Our iterates never leave this subspace, thanks to cycle updates.
48 Working in the Space of Cycles A cycle update on C2 moves f1 to f2.
49 Working in the Space of Cycles The difference between any two flows in this subspace is a linear combination of cycles.
50 Coordinate Descent in the Space of Cycles GOAL: Move from f0 to the electrical flow f*.
51 Coordinate Descent in the Space of Cycles Pick a basis of the space of cycles.
52 Coordinate Descent in the Space of Cycles Fix one coordinate at a time, i.e., coordinate descent.
55 Coordinate Descent in the Space of Cycles EXAMPLE: An ill-conditioned basis yields slow convergence.
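The effect of the basis on coordinate descent can be seen on a tiny quadratic, which is the same situation in miniature: the energy of a flow is a quadratic in the cycle coordinates. This demo (an invented 2-by-2 example, not the cycle basis itself) minimizes the same objective in a well-conditioned and in a nearly-degenerate basis:

```python
import numpy as np

# Exact coordinate descent on f(x) = 0.5 x'Ax - b'x, performed in a chosen
# basis B via the substitution x = Bz.
def coord_descent(A, b, B, sweeps):
    M = B.T @ A @ B          # Hessian in the z-coordinates
    c = B.T @ b
    z = np.zeros(len(c))
    for _ in range(sweeps):
        for i in range(len(z)):
            # exact minimization over coordinate i
            z[i] += (c[i] - M[i] @ z) / M[i, i]
    return B @ z

A = np.array([[2.0, -1.0], [-1.0, 2.0]])
b = np.array([1.0, 0.0])
x_star = np.linalg.solve(A, b)

good = np.eye(2)                               # orthogonal basis
bad  = np.array([[1.0, 1.0], [0.0, 1e-2]])     # nearly parallel columns

err_good = np.linalg.norm(coord_descent(A, b, good, 10) - x_star)
err_bad  = np.linalg.norm(coord_descent(A, b, bad, 10) - x_star)
print(err_good < err_bad)   # the ill-conditioned basis converges far slower
```

In the good basis, ten sweeps essentially solve the problem; in the nearly-degenerate one, the iterates barely move. Choosing the cycle basis well is exactly what the low-stretch spanning tree accomplishes.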
57 Low-Stretch Spanning Trees G
58 Low-Stretch Spanning Trees A spanning tree T of G.
60 Low-Stretch Spanning Trees Stretch of e = length of the cycle C e that e closes with T; in the example, st(e) = 5.
63 Low-Stretch Spanning Trees Average stretch: the average of st(e) over the edges of G.
64 Low-Stretch Spanning Trees Fact [Abraham, Neiman 12]: It is possible to compute a spanning tree with average stretch O(log n log log n) in almost-linear time.
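The average stretch of a given tree is straightforward to compute directly from the definition. A minimal sketch with unit edge lengths, following the slides' convention that the stretch of an off-tree edge is the length of the tree cycle it closes (a tree edge has stretch 1); the example graph and tree are illustrative:

```python
from collections import defaultdict, deque

def average_stretch(edges, tree):
    adj = defaultdict(list)
    for u, v in tree:
        adj[u].append(v); adj[v].append(u)

    def tree_dist(u, v):
        """Length of the unique u-v path in T (BFS works since T is a tree)."""
        dist = {u: 0}
        q = deque([u])
        while q:
            x = q.popleft()
            if x == v:
                return dist[x]
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1; q.append(y)

    total = 0
    for u, v in edges:
        if (u, v) in tree or (v, u) in tree:
            total += 1                      # tree edge
        else:
            total += tree_dist(u, v) + 1    # tree path plus e itself
    return total / len(edges)

# 4-cycle with one chord; the tree is the path a-b-c-d.
edges = [('a','b'), ('b','c'), ('c','d'), ('a','d'), ('a','c')]
tree  = {('a','b'), ('b','c'), ('c','d')}
print(average_stretch(edges, tree))   # -> 2.0
```

Here the chord ('a','d') closes a cycle of length 4 and ('a','c') one of length 3, so the average over the five edges is 2. The [Abraham, Neiman 12] construction guarantees such an average of O(log n log log n) on every graph.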
66 Electrical Flow: Algorithmic Components ITERATIVE METHOD Continuous Optimization: Randomized Coordinate Descent.
67 Electrical Flow: Algorithmic Components PROBLEM REPRESENTATION Combinatorial Optimization: the space of cycles, with basis given by a Low-Stretch Spanning Tree.
68 Electrical Flow: Algorithmic Components Joint Design of the two Components.
69 A Framework for Algorithmic Design ITERATIVE METHOD Leverage Continuous Optimization ideas; fast convergence requires a smooth problem. PROBLEM REPRESENTATION Not all representations are created equal: use combinatorial techniques to produce a smooth representation.
70 A Framework for Algorithmic Design Efficiency and simplicity rely on combining these two components in the right way.
71 Connection with Electrical Flow s-t Maximum Flow is Congestion Minimization.
72 Connection with Electrical Flow Electrical Flow is Energy Minimization; s-t Maximum Flow is Congestion Minimization.
73 Connection with Electrical Flow To connect the two, set resistances as:
74 An Extra Difficulty Objective function:
75 An Extra Difficulty Objective function: PROBLEM: THE OBJECTIVE IS EXTREMELY NON-SMOOTH
80 An Extra Step: Regularization Objective function:
81 An Extra Step: Regularization PROBLEM: THE OBJECTIVE IS EXTREMELY NON-SMOOTH. SOLUTION: Change the objective: find a function that is close to the original objective but somewhat smooth.
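The smoothing used for the max-flow objective is the softmax (log-sum-exp). A minimal sketch of the trade-off, with illustrative values of the smoothing parameter mu: the smoothed function is everywhere within mu*log(n) of the true max, so shrinking mu trades smoothness for accuracy.

```python
import math

def smax(xs, mu):
    """Softmax (log-sum-exp) regularization of max(xs)."""
    m = max(xs)   # subtract the max for numerical stability
    return m + mu * math.log(sum(math.exp((x - m) / mu) for x in xs))

xs = [0.3, 1.0, 0.99, -2.0]
n = len(xs)
for mu in (1.0, 0.1, 0.01):
    approx = smax(xs, mu)
    # The defining sandwich: max(xs) <= smax_mu(xs) <= max(xs) + mu*log(n).
    assert max(xs) <= approx <= max(xs) + mu * math.log(n) + 1e-12
    print(mu, round(approx, 4))
```

Unlike the max, `smax` is infinitely differentiable, which is what the iterative method needs for fast convergence.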
84 Applying the Framework: Comparison
OBJECTIVE: Electrical Flow needs no regularization; s-t Maximum Flow is regularized to softmax.
PROBLEM REPRESENTATION: Electrical Flow uses the basis given by a low-stretch spanning tree; for s-t Maximum Flow, which basis to use? Surprising equivalence: the right basis is an OBLIVIOUS ROUTING SCHEME.
ITERATIVE METHOD:
89 Oblivious Routing
90 Oblivious Routing GOAL: Route traffic between many pairs of users on the Internet Minimize maximum congestion of a link
93 Oblivious Routing GOAL: Route traffic between many pairs of users on the Internet Minimize maximum congestion of a link Routing = Probability Distribution over Paths = Flow
97 Oblivious Routing GOAL: Route traffic between many pairs of users on the Internet; minimize the maximum congestion of a link. DIFFICULTY: Requests arrive online, in arbitrary order. How to avoid a global flow computation at every new arrival?
98 Oblivious Routing GOAL: Route traffic between many pairs of users on the Internet Minimize maximum congestion of a link DIFFICULTY: Requests arrive online in arbitrary order How to avoid global flow computation at every new arrival SOLUTION: Oblivious Routing: Every request is routed obliviously of the other requests
100 Oblivious Routing GOAL: Route traffic between many pairs of users on the Internet; minimize the maximum congestion of a link. DIFFICULTY: Requests arrive online, in arbitrary order. How to avoid a global flow computation at every new arrival? SOLUTION: Oblivious Routing: every request is routed obliviously of the other requests. PREPROCESSING: Routes are pre-computed.
101 Oblivious Routing GOAL: Route traffic between many pairs of users on the Internet; minimize the maximum congestion of a link. DIFFICULTY: Requests arrive online, in arbitrary order. How to avoid a global flow computation at every new arrival? SOLUTION: Oblivious Routing: every request is routed obliviously of the other requests. PREPROCESSING: Routes are pre-computed. MEASURE OF PERFORMANCE: the worst-case ratio between the congestion of the oblivious routing and that of the optimal a posteriori routing, the COMPETITIVE RATIO.
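A deliberately naive oblivious routing scheme makes the definitions concrete: fix one spanning tree up front and route every request along its tree path, ignoring all other requests. This is a toy sketch (the graph, requests, and unit capacities are invented), not the talk's scheme, which uses a far better precomputed routing to obtain a good competitive ratio.

```python
from collections import defaultdict, deque

edges = [('a','b'), ('b','c'), ('c','d'), ('d','a')]   # a 4-cycle, unit capacities
tree  = {('a','b'), ('b','c'), ('c','d')}              # precomputed routing tree

adj = defaultdict(list)
for u, v in tree:
    adj[u].append(v); adj[v].append(u)

def tree_path(u, v):
    """Vertices of the unique u->v path in the routing tree (BFS)."""
    parent = {u: None}
    q = deque([u])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in parent:
                parent[y] = x; q.append(y)
    path = [v]
    while path[-1] != u:
        path.append(parent[path[-1]])
    return path[::-1]

# Requests arrive online; each is routed with no global recomputation.
requests = [('a','c'), ('a','c'), ('b','d')]
load = defaultdict(int)
for s, t in requests:
    p = tree_path(s, t)
    for x, y in zip(p, p[1:]):
        e = (x, y) if (x, y) in edges else (y, x)
        load[e] += 1

congestion = max(load.values())
print(congestion)   # -> 3: edge ('b','c') carries all three requests
```

The a posteriori optimum can spread the requests around both sides of the cycle and do strictly better, so this fixed tree pays a competitive-ratio penalty; good oblivious schemes keep that penalty small on every graph.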
103 Oblivious Routing: A New Scheme PREPROCESSING: routes are pre-computed in NEARLY-LINEAR RUNNING TIME, and the scheme achieves a SUBLINEAR COMPETITIVE RATIO.
104 Applying the Framework: Comparison
OBJECTIVE: Electrical Flow needs no regularization; s-t Maximum Flow is regularized to softmax.
PROBLEM REPRESENTATION: Electrical Flow uses the basis given by a low-stretch spanning tree; s-t Maximum Flow uses the basis given by an oblivious routing scheme.
ITERATIVE METHOD: Electrical Flow uses Coordinate Descent; s-t Maximum Flow uses Non-Euclidean Gradient Descent.
106 Euclidean Gradient Descent Contour map of the objective over the feasible subspace.
116 Non-Euclidean Gradient Descent Contour map of the objective over the feasible subspace.
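The contrast in the contour-map slides is between steps taken with respect to the Euclidean norm and steps taken with respect to a norm better matched to the objective's smoothness. A toy sketch of the two update rules (the objective, step sizes, and starting point are all invented for the example; this is not the talk's algorithm): the non-Euclidean step moves every coordinate by the same amount along -sign(grad), the steepest-descent direction in the l-infinity geometry that suits softmax-like objectives.

```python
import numpy as np

def f(x):
    """Smooth strongly convex test objective: log-sum-exp plus a quadratic."""
    return np.logaddexp.reduce(x) + 0.5 * np.sum(x**2)

def grad(x):
    return np.exp(x - np.logaddexp.reduce(x)) + x

x_euc = np.array([3.0, -2.0, 1.0])
x_non = x_euc.copy()
for _ in range(200):
    g = grad(x_euc)
    x_euc = x_euc - 0.1 * g                                        # Euclidean step
    g = grad(x_non)
    x_non = x_non - 0.1 * np.sign(g) * np.abs(g).sum() / len(g)    # l_inf-style step

print(round(f(x_euc), 3), round(f(x_non), 3))
```

Both rules drive this objective to its minimum; the point of the non-Euclidean view in the talk is that, for the regularized max-flow objective, measuring smoothness in the right norm is what makes a nearly-linear iteration count possible.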
124 Applying the Framework: Comparison
OBJECTIVE: Electrical Flow needs no regularization; s-t Maximum Flow is regularized to softmax.
PROBLEM REPRESENTATION: Electrical Flow uses the basis given by a low-stretch spanning tree; s-t Maximum Flow uses the basis given by an oblivious routing scheme.
ITERATIVE METHOD: Electrical Flow uses Coordinate Descent; s-t Maximum Flow uses Non-Euclidean Gradient Descent.
RESULT: NEARLY-LINEAR TIME FOR BOTH PROBLEMS.
125 Where Do We Go From Here? Future Directions and Open Problems
126 A New Algorithmic Approach A novel design framework for fast graph algorithms: it incorporates and leverages ideas from multiple fields, is based on a radically different approach, and has yielded conceptually simple, powerful algorithms. Combinatorial insight (Combinatorial Optimization) plays a crucial role in the PROBLEM REPRESENTATION: low-stretch spanning trees, oblivious routings. Numerical and scientific computing drive the ITERATIVE METHOD. Numerous potential applications in Algorithms and other fields.
127 Limits of Nearly-Linear Time?
Almost-Linear Time: Reachability; Shortest Path; Connectivity; Minimum Cost Spanning Tree; Laplacian Systems of Linear Equations; Undirected Flow Problems (s-t Maximum Flow, Concurrent Multi-commodity Flow, Oblivious Routing); Undirected Cut Problems (Minimum s-t Cut, Approximate Sparsest Cut, Approximate Minimum Conductance Cut).
Super-Linear Time: Directed Flow Problems; Directed Cut Problems; All-Pairs Shortest Path; Network Design.
130 Limits of Nearly-Linear Time? Recent Partial Progress: improved running time for directed flow problems [Madry 13]; conditional lower bounds for All-Pairs Shortest Path [Williams 13].
131 Recent Developments Beyond Graph Algorithms Interpretation of Nesterov's Acceleration as Gradient + Mirror Descent [Allen-Zhu, O 14]. APPLICATIONS: Faster Approximate Solver for Packing Linear Programs [Allen-Zhu, O 15]; Faster Parallel Algorithms for Positive Linear Programs [Allen-Zhu, O 14]. Coordinate Descent on Non-smooth Objectives to Construct Sparse Objects. APPLICATIONS: Fast Graph and Matrix Sparsification [Allen-Zhu, Liao, O 15]. Many more, from computational complexity to quantum computing.
Combinatorial Selection and Least Absolute Shrinkage via The CLASH Operator Volkan Cevher Laboratory for Information and Inference Systems LIONS / EPFL http://lions.epfl.ch & Idiap Research Institute joint
More informationContents. I Basics 1. Copyright by SIAM. Unauthorized reproduction of this article is prohibited.
page v Preface xiii I Basics 1 1 Optimization Models 3 1.1 Introduction... 3 1.2 Optimization: An Informal Introduction... 4 1.3 Linear Equations... 7 1.4 Linear Optimization... 10 Exercises... 12 1.5
More informationLECTURE NOTES Non-Linear Programming
CEE 6110 David Rosenberg p. 1 Learning Objectives LECTURE NOTES Non-Linear Programming 1. Write out the non-linear model formulation 2. Describe the difficulties of solving a non-linear programming model
More informationPreconditioning Linear Systems Arising from Graph Laplacians of Complex Networks
Preconditioning Linear Systems Arising from Graph Laplacians of Complex Networks Kevin Deweese 1 Erik Boman 2 1 Department of Computer Science University of California, Santa Barbara 2 Scalable Algorithms
More informationOptimization. Industrial AI Lab.
Optimization Industrial AI Lab. Optimization An important tool in 1) Engineering problem solving and 2) Decision science People optimize Nature optimizes 2 Optimization People optimize (source: http://nautil.us/blog/to-save-drowning-people-ask-yourself-what-would-light-do)
More informationDavid G. Luenberger Yinyu Ye. Linear and Nonlinear. Programming. Fourth Edition. ö Springer
David G. Luenberger Yinyu Ye Linear and Nonlinear Programming Fourth Edition ö Springer Contents 1 Introduction 1 1.1 Optimization 1 1.2 Types of Problems 2 1.3 Size of Problems 5 1.4 Iterative Algorithms
More informationarxiv: v1 [cs.ds] 9 Sep 2016
An Empirical Study of Cycle Toggling Based Laplacian Solvers arxiv:1609.0957v1 [cs.ds] 9 Sep 016 Abstract Kevin Deweese UCSB kdeweese@cs.ucsb.edu Richard Peng Georgia Tech rpeng@cc.gatech.edu John R. Gilbert
More informationCAIMS June 4-7, 2018, Ryerson University, Toronto, Ontario, Canada. Scientific Theme: Discrete Optimization and Applications
CAIMS 2018 June 4-7, 2018, Ryerson University, Toronto, Ontario, Canada Scientific Theme: Discrete Optimization and Applications Organizer: Konstantinos Georgiou, Ryerson University, Canada Plenary Speaker:
More informationCOMPUTATIONAL INTELLIGENCE (CS) (INTRODUCTION TO MACHINE LEARNING) SS16. Lecture 2: Linear Regression Gradient Descent Non-linear basis functions
COMPUTATIONAL INTELLIGENCE (CS) (INTRODUCTION TO MACHINE LEARNING) SS16 Lecture 2: Linear Regression Gradient Descent Non-linear basis functions LINEAR REGRESSION MOTIVATION Why Linear Regression? Regression
More informationOnline Algorithms. Lecture 11
Online Algorithms Lecture 11 Today DC on trees DC on arbitrary metrics DC on circle Scheduling K-server on trees Theorem The DC Algorithm is k-competitive for the k server problem on arbitrary tree metrics.
More informationSAMG: Sparsified Graph-Theoretic Algebraic Multigrid for Solving Large Symmetric Diagonally Dominant (SDD) Matrices
SAMG: Sparsified Graph-Theoretic Algebraic Multigrid for Solving Large Symmetric Diagonally Dominant (SDD) Matrices Zhiqiang Zhao Department of ECE Michigan Technological University Houghton, Michigan
More informationParallel Local Graph Clustering
Parallel Local Graph Clustering Kimon Fountoulakis, joint work with J. Shun, X. Cheng, F. Roosta-Khorasani, M. Mahoney, D. Gleich University of California Berkeley and Purdue University Based on J. Shun,
More informationSMORE: Semi-Oblivious Traffic Engineering
SMORE: Semi-Oblivious Traffic Engineering Praveen Kumar * Yang Yuan* Chris Yu Nate Foster* Robert Kleinberg* Petr Lapukhov # Chiun Lin Lim # Robert Soulé * Cornell CMU # Facebook USI Lugano WAN Traffic
More informationIntroduction to Mathematical Programming IE496. Final Review. Dr. Ted Ralphs
Introduction to Mathematical Programming IE496 Final Review Dr. Ted Ralphs IE496 Final Review 1 Course Wrap-up: Chapter 2 In the introduction, we discussed the general framework of mathematical modeling
More informationCS675: Convex and Combinatorial Optimization Spring 2018 The Simplex Algorithm. Instructor: Shaddin Dughmi
CS675: Convex and Combinatorial Optimization Spring 2018 The Simplex Algorithm Instructor: Shaddin Dughmi Algorithms for Convex Optimization We will look at 2 algorithms in detail: Simplex and Ellipsoid.
More information16.410/413 Principles of Autonomy and Decision Making
16.410/413 Principles of Autonomy and Decision Making Lecture 17: The Simplex Method Emilio Frazzoli Aeronautics and Astronautics Massachusetts Institute of Technology November 10, 2010 Frazzoli (MIT)
More informationSublinear Algorithms for Big Data Analysis
Sublinear Algorithms for Big Data Analysis Michael Kapralov Theory of Computation Lab 4 EPFL 7 September 2017 The age of big data: massive amounts of data collected in various areas of science and technology
More informationEnsemble methods in machine learning. Example. Neural networks. Neural networks
Ensemble methods in machine learning Bootstrap aggregating (bagging) train an ensemble of models based on randomly resampled versions of the training set, then take a majority vote Example What if you
More informationProgramming aspects in a combinatorial sparse linear solver
U.U.D.M. Project Report 2014:2 Programming aspects in a combinatorial sparse linear solver Aron Rindstedt Examensarbete i matematik, 15 hp Handledare: Dimitar Lukarski, Institutionen för informationsteknologi
More informationConvex Optimization CMU-10725
Convex Optimization CMU-10725 Conjugate Direction Methods Barnabás Póczos & Ryan Tibshirani Conjugate Direction Methods 2 Books to Read David G. Luenberger, Yinyu Ye: Linear and Nonlinear Programming Nesterov:
More informationAlina Ene. From Minimum Cut to Submodular Minimization Leveraging the Decomposable Structure. Boston University
From Minimum Cut to Submodular Minimization Leveraging the Decomposable Structure Alina Ene Boston University Joint work with Huy Nguyen (Northeastern University) Laszlo Vegh (London School of Economics)
More informationClassical Gradient Methods
Classical Gradient Methods Note simultaneous course at AMSI (math) summer school: Nonlin. Optimization Methods (see http://wwwmaths.anu.edu.au/events/amsiss05/) Recommended textbook (Springer Verlag, 1999):
More informationConvex Optimization. Lijun Zhang Modification of
Convex Optimization Lijun Zhang zlj@nju.edu.cn http://cs.nju.edu.cn/zlj Modification of http://stanford.edu/~boyd/cvxbook/bv_cvxslides.pdf Outline Introduction Convex Sets & Functions Convex Optimization
More informationAlgebraic Iterative Methods for Computed Tomography
Algebraic Iterative Methods for Computed Tomography Per Christian Hansen DTU Compute Department of Applied Mathematics and Computer Science Technical University of Denmark Per Christian Hansen Algebraic
More informationAlgorithms for convex optimization
Algorithms for convex optimization Michal Kočvara Institute of Information Theory and Automation Academy of Sciences of the Czech Republic and Czech Technical University kocvara@utia.cas.cz http://www.utia.cas.cz/kocvara
More informationarxiv: v1 [cs.ds] 1 Jan 2015
Fast Generation of Random Spanning Trees and the Effective Resistance Metric Aleksander Madry EPFL Damian Straszak University of Wroc law Jakub Tarnawski University of Wroc law arxiv:1501.00267v1 [cs.ds]
More informationLecture 11: Randomized Least-squares Approximation in Practice. 11 Randomized Least-squares Approximation in Practice
Stat60/CS94: Randomized Algorithms for Matrices and Data Lecture 11-10/09/013 Lecture 11: Randomized Least-squares Approximation in Practice Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these
More informationFoundations of Computing
Foundations of Computing Darmstadt University of Technology Dept. Computer Science Winter Term 2005 / 2006 Copyright c 2004 by Matthias Müller-Hannemann and Karsten Weihe All rights reserved http://www.algo.informatik.tu-darmstadt.de/
More informationRecent Developments in Model-based Derivative-free Optimization
Recent Developments in Model-based Derivative-free Optimization Seppo Pulkkinen April 23, 2010 Introduction Problem definition The problem we are considering is a nonlinear optimization problem with constraints:
More informationComposite Self-concordant Minimization
Composite Self-concordant Minimization Volkan Cevher Laboratory for Information and Inference Systems-LIONS Ecole Polytechnique Federale de Lausanne (EPFL) volkan.cevher@epfl.ch Paris 6 Dec 11, 2013 joint
More informationMVE165/MMG630, Applied Optimization Lecture 8 Integer linear programming algorithms. Ann-Brith Strömberg
MVE165/MMG630, Integer linear programming algorithms Ann-Brith Strömberg 2009 04 15 Methods for ILP: Overview (Ch. 14.1) Enumeration Implicit enumeration: Branch and bound Relaxations Decomposition methods:
More informationCPSC 340: Machine Learning and Data Mining. Principal Component Analysis Fall 2016
CPSC 340: Machine Learning and Data Mining Principal Component Analysis Fall 2016 A2/Midterm: Admin Grades/solutions will be posted after class. Assignment 4: Posted, due November 14. Extra office hours:
More informationClustering for Faster Network Simplex Pivots
Clustering for Faster Network Simplex Pivots David Eppstein Department of Information and Computer Science University of California, Irvine, CA 92717 Tech. Report 93-19 April 15, 1993 Abstract We show
More informationarxiv: v3 [cs.ds] 6 Oct 2015
Evaluating the Potential of a Dual Randomized Kaczmarz Solver for Laplacian Linear Systems Erik G. Boman Sandia National Laboratories Center for Computing Research egboman@sandia.gov Kevin Deweese UC Santa
More informationGraphs and Network Flows IE411. Lecture 21. Dr. Ted Ralphs
Graphs and Network Flows IE411 Lecture 21 Dr. Ted Ralphs IE411 Lecture 21 1 Combinatorial Optimization and Network Flows In general, most combinatorial optimization and integer programming problems are
More informationSolving basis pursuit: infeasible-point subgradient algorithm, computational comparison, and improvements
Solving basis pursuit: infeasible-point subgradient algorithm, computational comparison, and improvements Andreas Tillmann ( joint work with D. Lorenz and M. Pfetsch ) Technische Universität Braunschweig
More informationSupplementary Materials for
advances.sciencemag.org/cgi/content/full/4/1/eaao7005/dc1 Supplementary Materials for Computational discovery of extremal microstructure families The PDF file includes: Desai Chen, Mélina Skouras, Bo Zhu,
More informationk-means, k-means++ Barna Saha March 8, 2016
k-means, k-means++ Barna Saha March 8, 2016 K-Means: The Most Popular Clustering Algorithm k-means clustering problem is one of the oldest and most important problem. K-Means: The Most Popular Clustering
More informationCS 395T Lecture 12: Feature Matching and Bundle Adjustment. Qixing Huang October 10 st 2018
CS 395T Lecture 12: Feature Matching and Bundle Adjustment Qixing Huang October 10 st 2018 Lecture Overview Dense Feature Correspondences Bundle Adjustment in Structure-from-Motion Image Matching Algorithm
More informationAlgorithms and Data Structures
Algorithm Analysis Page 1 - Algorithm Analysis Dr. Fall 2008 Algorithm Analysis Page 2 Outline Textbook Overview Analysis of Algorithm Pseudo-Code and Primitive Operations Growth Rate and Big-Oh Notation
More informationFrom Routing to Traffic Engineering
1 From Routing to Traffic Engineering Robert Soulé Advanced Networking Fall 2016 2 In the beginning B Goal: pair-wise connectivity (get packets from A to B) Approach: configure static rules in routers
More informationCS 435, 2018 Lecture 2, Date: 1 March 2018 Instructor: Nisheeth Vishnoi. Convex Programming and Efficiency
CS 435, 2018 Lecture 2, Date: 1 March 2018 Instructor: Nisheeth Vishnoi Convex Programming and Efficiency In this lecture, we formalize convex programming problem, discuss what it means to solve it efficiently
More informationExponential Start Time Clustering and its Applications in Spectral Graph Theory. Shen Chen Xu
Exponential Start Time Clustering and its Applications in Spectral Graph Theory Shen Chen Xu CMU-CS-17-120 August 2017 School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Thesis
More information291 Programming Assignment #3
000 001 002 003 004 005 006 007 008 009 010 011 012 013 014 015 016 017 018 019 020 021 022 023 024 025 026 027 028 029 030 031 032 033 034 035 036 037 038 039 040 041 042 043 044 045 046 047 048 049 050
More informationCopyright 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin Introduction to the Design & Analysis of Algorithms, 2 nd ed., Ch.
Iterative Improvement Algorithm design technique for solving optimization problems Start with a feasible solution Repeat the following step until no improvement can be found: change the current feasible
More informationComputing Heat Kernel Pagerank and a Local Clustering Algorithm
Computing Heat Kernel Pagerank and a Local Clustering Algorithm Fan Chung and Olivia Simpson University of California, San Diego La Jolla, CA 92093 {fan,osimpson}@ucsd.edu Abstract. Heat kernel pagerank
More informationALGORITHMS FOR DICTIONARY LEARNING
ALGORITHMS FOR DICTIONARY LEARNING ANKUR MOITRA MASSACHUSETTS INSTITUTE OF TECHNOLOGY SPARSE REPRESENTATIONS SPARSE REPRESENTATIONS Many data- types are sparse in an appropriately chosen basis: SPARSE
More informationEfficient and practical tree preconditioning for solving Laplacian systems
Efficient and practical tree preconditioning for solving Laplacian systems Luca Castelli Aleardi, Alexandre Nolin, Maks Ovsjanikov To cite this version: Luca Castelli Aleardi, Alexandre Nolin, Maks Ovsjanikov.
More informationSPECTRAL SPARSIFICATION IN SPECTRAL CLUSTERING
SPECTRAL SPARSIFICATION IN SPECTRAL CLUSTERING Alireza Chakeri, Hamidreza Farhidzadeh, Lawrence O. Hall Department of Computer Science and Engineering College of Engineering University of South Florida
More informationThe Design of Approximation Algorithms
The Design of Approximation Algorithms David P. Williamson Cornell University David B. Shmoys Cornell University m Щ0 CAMBRIDGE UNIVERSITY PRESS Contents Preface page ix I An Introduction to the Techniques
More informationParallel Computation of Spherical Parameterizations for Mesh Analysis. Th. Athanasiadis and I. Fudos University of Ioannina, Greece
Parallel Computation of Spherical Parameterizations for Mesh Analysis Th. Athanasiadis and I. Fudos, Greece Introduction Mesh parameterization is a powerful geometry processing tool Applications Remeshing
More informationNetwork optimization: An overview
Network optimization: An overview Mathias Johanson Alkit Communications 1 Introduction Various kinds of network optimization problems appear in many fields of work, including telecommunication systems,
More informationA (somewhat) Unified Approach to Semisupervised and Unsupervised Learning
A (somewhat) Unified Approach to Semisupervised and Unsupervised Learning Ben Recht Center for the Mathematics of Information Caltech April 11, 2007 Joint work with Ali Rahimi (Intel Research) Overview
More information3 Interior Point Method
3 Interior Point Method Linear programming (LP) is one of the most useful mathematical techniques. Recent advances in computer technology and algorithms have improved computational speed by several orders
More informationLecture 9 - Matrix Multiplication Equivalences and Spectral Graph Theory 1
CME 305: Discrete Mathematics and Algorithms Instructor: Professor Aaron Sidford (sidford@stanfordedu) February 6, 2018 Lecture 9 - Matrix Multiplication Equivalences and Spectral Graph Theory 1 In the
More informationLec13p1, ORF363/COS323
Lec13 Page 1 Lec13p1, ORF363/COS323 This lecture: Semidefinite programming (SDP) Definition and basic properties Review of positive semidefinite matrices SDP duality SDP relaxations for nonconvex optimization
More informationADVANCED MACHINE LEARNING MACHINE LEARNING. Kernel for Clustering kernel K-Means
1 MACHINE LEARNING Kernel for Clustering ernel K-Means Outline of Today s Lecture 1. Review principle and steps of K-Means algorithm. Derive ernel version of K-means 3. Exercise: Discuss the geometrical
More informationFMA901F: Machine Learning Lecture 3: Linear Models for Regression. Cristian Sminchisescu
FMA901F: Machine Learning Lecture 3: Linear Models for Regression Cristian Sminchisescu Machine Learning: Frequentist vs. Bayesian In the frequentist setting, we seek a fixed parameter (vector), with value(s)
More informationLecture 27: Fast Laplacian Solvers
Lecture 27: Fast Laplacian Solvers Scribed by Eric Lee, Eston Schweickart, Chengrun Yang November 21, 2017 1 How Fast Laplacian Solvers Work We want to solve Lx = b with L being a Laplacian matrix. Recall
More informationB553 Lecture 12: Global Optimization
B553 Lecture 12: Global Optimization Kris Hauser February 20, 2012 Most of the techniques we have examined in prior lectures only deal with local optimization, so that we can only guarantee convergence
More informationA Novel Image Super-resolution Reconstruction Algorithm based on Modified Sparse Representation
, pp.162-167 http://dx.doi.org/10.14257/astl.2016.138.33 A Novel Image Super-resolution Reconstruction Algorithm based on Modified Sparse Representation Liqiang Hu, Chaofeng He Shijiazhuang Tiedao University,
More informationCHAPTER SIX. the RRM creates small covering roadmaps for all tested environments.
CHAPTER SIX CREATING SMALL ROADMAPS Many algorithms have been proposed that create a roadmap from which a path for a moving object can be extracted. These algorithms generally do not give guarantees on
More informationNatural Language Processing
Natural Language Processing Classification III Dan Klein UC Berkeley 1 Classification 2 Linear Models: Perceptron The perceptron algorithm Iteratively processes the training set, reacting to training errors
More informationIntroduction to Dynamic Traffic Assignment
Introduction to Dynamic Traffic Assignment CE 392D January 22, 2018 WHAT IS EQUILIBRIUM? Transportation systems involve interactions among multiple agents. The basic facts are: Although travel choices
More informationData Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University
Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Search & Optimization Search and Optimization method deals with
More informationGeometric Mean Algorithms Based on Harmonic and Arithmetic Iterations
Geometric Mean Algorithms Based on Harmonic and Arithmetic Iterations Ben Jeuris and Raf Vandebril KU Leuven, Dept. of Computer Science, 3001 Leuven(Heverlee), Belgium {ben.jeuris,raf.vandebril}@cs.kuleuven.be
More informationCentrality Book. cohesion.
Cohesion The graph-theoretic terms discussed in the previous chapter have very specific and concrete meanings which are highly shared across the field of graph theory and other fields like social network
More informationComputational Methods. Constrained Optimization
Computational Methods Constrained Optimization Manfred Huber 2010 1 Constrained Optimization Unconstrained Optimization finds a minimum of a function under the assumption that the parameters can take on
More informationOptimization of FEM solver for heterogeneous multicore processor Cell. Noriyuki Kushida 1
Optimization of FEM solver for heterogeneous multicore processor Cell Noriyuki Kushida 1 1 Center for Computational Science and e-system Japan Atomic Energy Research Agency 6-9-3 Higashi-Ueno, Taito-ku,
More informationBidirectional search and Goal-directed Dijkstra
Computing the shortest path Bidirectional search and Goal-directed Dijkstra Alexander Kozyntsev October 18, 2010 Abstract We study the problem of finding a shortest path between two vertices in a directed
More informationOptimizing Data Locality for Iterative Matrix Solvers on CUDA
Optimizing Data Locality for Iterative Matrix Solvers on CUDA Raymond Flagg, Jason Monk, Yifeng Zhu PhD., Bruce Segee PhD. Department of Electrical and Computer Engineering, University of Maine, Orono,
More informationSDLS: a Matlab package for solving conic least-squares problems
SDLS: a Matlab package for solving conic least-squares problems Didier Henrion 1,2 Jérôme Malick 3 June 28, 2007 Abstract This document is an introduction to the Matlab package SDLS (Semi-Definite Least-Squares)
More informationResearch Interests Optimization:
Mitchell: Research interests 1 Research Interests Optimization: looking for the best solution from among a number of candidates. Prototypical optimization problem: min f(x) subject to g(x) 0 x X IR n Here,
More informationCPSC 340: Machine Learning and Data Mining. Principal Component Analysis Fall 2017
CPSC 340: Machine Learning and Data Mining Principal Component Analysis Fall 2017 Assignment 3: 2 late days to hand in tonight. Admin Assignment 4: Due Friday of next week. Last Time: MAP Estimation MAP
More information1. Lecture notes on bipartite matching February 4th,
1. Lecture notes on bipartite matching February 4th, 2015 6 1.1.1 Hall s Theorem Hall s theorem gives a necessary and sufficient condition for a bipartite graph to have a matching which saturates (or matches)
More informationTRAVELTIME TOMOGRAPHY (CONT)
30 March 005 MODE UNIQUENESS The forward model TRAVETIME TOMOGRAPHY (CONT di = A ik m k d = Am (1 Data vecto r Sensitivit y matrix Model vector states the linearized relationship between data and model
More informationLecture 9: (Semi-)bandits and experts with linear costs (part I)
CMSC 858G: Bandits, Experts and Games 11/03/16 Lecture 9: (Semi-)bandits and experts with linear costs (part I) Instructor: Alex Slivkins Scribed by: Amr Sharaf In this lecture, we will study bandit problems
More informationREGRESSION ANALYSIS : LINEAR BY MAUAJAMA FIRDAUS & TULIKA SAHA
REGRESSION ANALYSIS : LINEAR BY MAUAJAMA FIRDAUS & TULIKA SAHA MACHINE LEARNING It is the science of getting computer to learn without being explicitly programmed. Machine learning is an area of artificial
More informationChapter 5 Supercomputers
Chapter 5 Supercomputers Part I. Preliminaries Chapter 1. What Is Parallel Computing? Chapter 2. Parallel Hardware Chapter 3. Parallel Software Chapter 4. Parallel Applications Chapter 5. Supercomputers
More informationAnimation Lecture 10 Slide Fall 2003
Animation Lecture 10 Slide 1 6.837 Fall 2003 Conventional Animation Draw each frame of the animation great control tedious Reduce burden with cel animation layer keyframe inbetween cel panoramas (Disney
More informationSemi-Oblivious Traffic Engineering:
Semi-Oblivious Traffic Engineering: The Road Not Taken Praveen Kumar (Cornell) Yang Yuan (Cornell) Chris Yu (CMU) Nate Foster (Cornell) Robert Kleinberg (Cornell) Petr Lapukhov (Facebook) Chiun Lin Lim
More informationCONLIN & MMA solvers. Pierre DUYSINX LTAS Automotive Engineering Academic year
CONLIN & MMA solvers Pierre DUYSINX LTAS Automotive Engineering Academic year 2018-2019 1 CONLIN METHOD 2 LAY-OUT CONLIN SUBPROBLEMS DUAL METHOD APPROACH FOR CONLIN SUBPROBLEMS SEQUENTIAL QUADRATIC PROGRAMMING
More informationECE 901: Large-scale Machine Learning and Optimization Spring Lecture 13 March 6
ECE 901: Large-scale Machine Learning and Optimization Spring 2018 Lecture 13 March 6 Lecturer: Dimitris Papailiopoulos Scribe: Yuhua Zhu & Xiaowu Dai Note: These lecture notes are still rough, and have
More information2. True or false: even though BFS and DFS have the same space complexity, they do not always have the same worst case asymptotic time complexity.
1. T F: Consider a directed graph G = (V, E) and a vertex s V. Suppose that for all v V, there exists a directed path in G from s to v. Suppose that a DFS is run on G, starting from s. Then, true or false:
More information