Modern Exact and Approximate Combinatorial Optimization Algorithms: Max-Product and Max-Sum-Product

Transcription:

Modern Exact and Approximate Combinatorial Optimization Algorithms: Max-Product and Max-Sum-Product. In the pursuit of a universal solver. Rina Dechter, Donald Bren School of ICS, University of California, Irvine. Main collaborators: Radu Marinescu, Lars Otten, Alex Ihler, Kalev Kask, Robert Mateescu. ISIM 2016

How to Design a Good Solver: Heuristic Search. The core of a good search algorithm: a compact search space, a good heuristic evaluation function, and a good traversal strategy. Anytime search yields a good approximation.

Bounding from Above and Below. For a maximization problem (MPE): a relaxation provides an upper bound, while any consistent configuration provides a lower bound.

Outline: Graphical models, queries, algorithms. Inference algorithms: bucket elimination. Bounded inference: (a) mini-bucket, (b) cost shifting. AND/OR search spaces and AND/OR branch and bound. Evaluation, software. Summary and future work.


Graphical Models. Four example formalisms: (a) a belief network (directed), specified by conditional probability tables P(X | parents); (b) a constraint network (undirected), specified by relations; (c) an influence diagram (the oil-wildcatter example, with test and drill decisions, costs, and oil-sale utilities); (d) a Markov network.

Many, many applications.

Linkage Analysis. A pedigree with 6 people and 3 markers, modeled with locus (L), phenotype (X), and selector (S) variables related by constraints and probabilistic relations. (Slide thanks to Dan Geiger.)

Graphical Models. A graphical model is a triple (X, D, F): variables X = {X1, ..., Xn}, domains D = {D1, ..., Dn}, and functions F = {f1, ..., fr} (constraints, CPTs, CNFs, ...). Operators: combination (sum, product, join) and elimination (projection, sum, max/min). Tasks: belief updating sum_{X\Y} prod_j P_j; MPE/MAP max_X prod_j P_j; marginal MAP max_Y sum_{X\Y} prod_j P_j; CSP join_j R_j; Max-CSP min_X sum_j f_j. Each function f_i may be, for example, a conditional probability table (CPT) or a relation, and the primal (interaction) graph connects any two variables that appear in the same function. All these tasks are NP-hard, so we exploit problem structure, identify special cases, and approximate.
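To make the (X, D, F) formulation concrete, here is a minimal Python sketch (illustrative only, not the talk's software) of a tiny discrete model with tabulated factors, showing the combination operator (product) and the two elimination operators used by the tasks above: sum for belief updating and max for MPE.

    from itertools import product

    # A tiny discrete graphical model (X, D, F); names and numbers are illustrative.
    variables = ["A", "B"]
    domains = {"A": [0, 1], "B": [0, 1]}
    factors = [
        {"scope": ("A",), "table": {(0,): 0.6, (1,): 0.4}},              # P(A)
        {"scope": ("A", "B"), "table": {(0, 0): 0.9, (0, 1): 0.1,        # P(B | A)
                                        (1, 0): 0.2, (1, 1): 0.8}},
    ]

    def joint_value(assignment):
        """Combination operator: product of all factors on a full assignment."""
        val = 1.0
        for f in factors:
            val *= f["table"][tuple(assignment[v] for v in f["scope"])]
        return val

    # Elimination by sum (belief updating) and by max (MPE).
    p_b1 = sum(joint_value({"A": a, "B": 1}) for a in domains["A"])
    mpe_value, mpe_tuple = max(
        (joint_value(dict(zip(variables, xs))), xs)
        for xs in product(*(domains[v] for v in variables)))
    print(p_b1)                  # P(B = 1)
    print(mpe_value, mpe_tuple)  # most probable explanation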

Why Marginal MAP? Often marginal MAP is the right task: we have a model describing a large system, but we only care about predicting the state of some part of it. Example: decision making, where we sum over random variables (random effects, etc.) and maximize over decision variables (to specify action policies). Typical settings: sensor networks, influence diagrams.

Tree solving is easy. On a tree, belief updating (sum-product), CSP consistency (projection-join), MPE (max-product), and #CSP (sum-product) are all solved by passing messages such as m_XY(X) and m_YX(X) along the edges; trees are processed in linear time and memory. Marginal MAP, however, is not easy even on trees.
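As a small illustration of the linear-time claim, here is a sketch of a single upward sum-product pass on a rooted tree with pairwise factors; the tree, tables, and names are made up for the example.

    import numpy as np

    children = {"X": ["Y", "Z"], "Y": [], "Z": []}           # tree rooted at X
    unary = {"X": np.array([0.5, 0.5]),
             "Y": np.array([0.7, 0.3]),
             "Z": np.array([0.4, 0.6])}
    edge = {("X", "Y"): np.array([[0.9, 0.1], [0.2, 0.8]]),  # f(x, y)
            ("X", "Z"): np.array([[0.6, 0.4], [0.3, 0.7]])}  # f(x, z)

    def message(parent, child):
        """m_{child->parent}(parent) = sum_child f(parent, child) * belief(child)."""
        belief = unary[child].copy()
        for grandchild in children[child]:
            belief *= message(child, grandchild)
        return edge[(parent, child)].dot(belief)

    root_belief = unary["X"].copy()
    for c in children["X"]:
        root_belief *= message("X", c)
    print(root_belief / root_belief.sum())   # marginal at the root after one upward pass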

Inference vs. Conditioning (Search). Inference (e.g., over a tree decomposition of the primal graph into clusters) takes exp(w*) time and space; pure search over the variables takes exp(n) time but only O(n) space. Combining search and inference gives space exp(w) and time exp(w + c(w)), where w is user-controlled.

Inference vs. Conditioning (Search), continued: searching the context-minimal AND/OR search graph (18 AND nodes in the example) takes time, and caching space, exponential in w*. Combining search and inference gives space exp(q) and time exp(q + c(q)), with q user-controlled.

Outline: Graphical models, queries, algorithms. Inference algorithms. Bounded inference: (a) mini-bucket, (b) cost shifting. AND/OR search spaces and AND/OR branch and bound. Evaluation, software. Conclusions.

Likelihood queries: inference (sum-product).

Finding Marginals by Bucket Elimination: algorithm BE-bel (Dechter 1996). Example: compute P(A | E = 0) for a network with factors P(A), P(B|A), P(C|A), P(D|B,A), P(E|B,C), whose product is P(A,B,C,D,E). Each variable gets a bucket holding the functions whose latest variable it is. Processing bucket B (with P(B|A), P(D|B,A), P(E|B,C)), then buckets C, D, and E (the latter with evidence E = 0) by combining each bucket's functions and summing out the bucket variable generates messages lambda_B(A,D,C,E), lambda_C(A,D,E), lambda_D(A,E), lambda_E(A); bucket A finally yields P(A, E=0) and P(E=0). The elimination operator is summation, and time and space are exponential in the induced width (treewidth) of the ordering, here w* = 4 (the maximal clique size).
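The following sketch follows this bucket-elimination recipe under simplifying assumptions (dict-based tabular factors; evidence is assumed to have been absorbed into the tables beforehand); it is not the talk's implementation. Passing op=max instead of the default sum turns BE-bel into BE-mpe.

    from itertools import product

    def eliminate(bucket, var, domains, op):
        """Combine the bucket's functions (product) and eliminate `var` with `op`."""
        scope = sorted({v for f in bucket for v in f["scope"] if v != var})
        table = {}
        for assign in product(*(domains[v] for v in scope)):
            ctx = dict(zip(scope, assign))
            vals = []
            for x in domains[var]:
                ctx[var] = x
                p = 1.0
                for f in bucket:
                    p *= f["table"][tuple(ctx[v] for v in f["scope"])]
                vals.append(p)
            table[assign] = op(vals)
        return {"scope": tuple(scope), "table": table}

    def bucket_elimination(factors, ordering, domains, op=sum):
        buckets = {v: [] for v in ordering}
        for f in factors:                        # each function goes to the bucket of
            buckets[max(f["scope"], key=ordering.index)].append(f)   # its latest variable
        constant = 1.0
        for v in reversed(ordering):             # process from the last bucket down
            if not buckets[v]:
                continue
            msg = eliminate(buckets[v], v, domains, op)
            if msg["scope"]:
                buckets[max(msg["scope"], key=ordering.index)].append(msg)
            else:
                constant *= msg["table"][()]
        return constant                          # P(evidence) for sum, MPE value for max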

Optimization queries: inference.

Finding the MPE by Bucket Elimination: algorithm BE-mpe (Dechter 1996; Bertele and Brioschi 1977). MPE = max over a, e, d, c, b of P(a) P(c|a) P(b|a) P(d|b,a) P(e|b,c). The buckets are the same as before, but the elimination operator is maximization: bucket B (with P(b|a), P(d|b,a), P(e|b,c)) produces h_B(a,d,c,e), bucket C produces h_C(a,d,e), bucket D produces h_D(a,e), bucket E (with e = 0) produces h_E(a), and bucket A returns the optimum. Again w* = 4 (induced width, the maximal clique size).

Generating the MPE tuple: once the buckets have been processed, assign the variables in the forward direction, each by maximizing the functions in its own bucket given the assignments already made: 1. a' = argmax_a P(a) h_E(a); 2. e' = 0 (the evidence); 3. d' = argmax_d h_C(a', d, e'); 4. c' = argmax_c P(c|a') h_B(a', d', c, e'); 5. b' = argmax_b P(b|a') P(d'|b, a') P(e'|b, c'). Return (a', b', c', d', e').
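A sketch of that forward decoding, assuming the buckets dictionary from the previous sketch was retained after a max pass (op=max), so each bucket still holds its original functions plus the messages deposited into it:

    def decode_mpe(buckets, ordering, domains):
        """Assign variables forward along the ordering, maximizing each bucket's
        functions (original factors plus received messages) given earlier choices."""
        assignment = {}
        for v in ordering:
            best_x, best_val = None, -1.0
            for x in domains[v]:
                assignment[v] = x
                val = 1.0
                for f in buckets[v]:
                    val *= f["table"][tuple(assignment[u] for u in f["scope"])]
                if val > best_val:
                    best_x, best_val = x, val
            assignment[v] = best_x
        return assignment

Here buckets would be returned by a variant of bucket_elimination above that exposes its internal buckets dictionary after the max pass.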

The Induced Width / Treewidth. w*(d) is the induced width of the graph along ordering d. The ordering matters: for the same moral graph, one ordering d1 can give w*(d1) = 4 while another gives w*(d2) = 2.
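The induced width of an ordering can be computed directly from the primal graph. The sketch below (illustrative names and example) processes the variables from last to first, connecting each variable's earlier neighbors, exactly as in the induced-graph construction:

    def induced_width(edges, ordering):
        """Induced width of the primal graph along `ordering`."""
        pos = {v: i for i, v in enumerate(ordering)}
        adj = {v: set() for v in ordering}
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        width = 0
        for v in reversed(ordering):
            earlier = {u for u in adj[v] if pos[u] < pos[v]}
            width = max(width, len(earlier))
            for a in earlier:                    # connect earlier neighbors pairwise
                for b in earlier:
                    if a != b:
                        adj[a].add(b)
            for u in adj[v]:                     # remove v from the remaining graph
                adj[u].discard(v)
        return width

    # A 4-cycle has induced width 2 along any ordering.
    print(induced_width({("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")},
                        ["A", "B", "C", "D"]))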

Marginal MAP: inference.

Variable Elimination (max-sum-product). For pure MPE or pure summation tasks, dynamic programming applies directly (e.g., it is efficient on trees). For marginal MAP the operations do not commute: the sum must be done first! (Slide credit: Alexander Ihler, APPRIL site visit.)
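A two-by-two numeric check of the non-commutativity; the numbers are arbitrary:

    import numpy as np

    f = np.array([[0.6, 0.1],     # f(m, s): rows = a MAX variable m, cols = a SUM variable s
                  [0.3, 0.5]])
    print(max(f.sum(axis=1)))     # marginal MAP value: max_m sum_s f(m, s) = 0.8
    print(f.max(axis=0).sum())    # operators swapped: sum_s max_m f(m, s) = 1.1, an upper bound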

Bucket Elimination for Marginal MAP: bucket elimination with a constrained elimination order, in which the SUM variables are eliminated before the MAX variables. Interleaving MAX and SUM variables yields only an upper bound [Dechter, 1999]. MAP* denotes the marginal MAP value.

Bucket Elimination for Marginal MAP: with a constrained elimination order the result is exact; with an unconstrained order it is only an upper bound [Dechter, 1999]. In practice, the constrained induced width is much larger!

Outline: Graphical models, queries, algorithms. Inference algorithms: bucket elimination. Bounded inference: (a) mini-bucket, (b) cost shifting. AND/OR search spaces and AND/OR branch and bound. Evaluation, software. Conclusions.

Mini-Bucket Approximation: Relaxation (Dechter and Rish, 1997, 2003). Split a bucket into mini-buckets to bound complexity: instead of eliminating X from the combination of all the functions in its bucket at once, eliminate X separately from each mini-bucket (here an h part and a g part). The exponential complexity decreases from O(e^n) to O(e^r) + O(e^(n-r)).

Mini-Bucket Elimination. Splitting a bucket amounts to duplicating (renaming) the bucket variable. In the example, bucket B is partitioned into mini-buckets, each eliminated separately with max, and the resulting messages (h_B functions over smaller scopes) are placed in lower buckets; the effective width drops to w = 2. The computed MPE* is an upper bound (U) on the MPE, and instantiating a solution from the messages yields a lower bound (L).

Properties of Mini-Bucket Elimination. Complexity: O(r exp(i)) time and O(exp(i)) space, where i is the i-bound. Accuracy is reflected by the upper/lower bound gap; as i increases, both accuracy and complexity increase. Possible uses of mini-bucket approximations: as anytime algorithms and as heuristics for search.
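A sketch of the greedy partitioning step behind MBE(i), assuming factors are dicts with a "scope" field as in the earlier sketches; the i-bound caps the number of variables (besides the bucket variable) in each mini-bucket's joint scope:

    def partition_bucket(bucket, var, i_bound):
        """Greedily split a bucket's functions into mini-buckets whose joint scope
        (excluding the bucket variable `var`) has at most i_bound variables."""
        mini_buckets = []                          # list of (scope set, function list)
        for f in sorted(bucket, key=lambda f: -len(f["scope"])):
            scope = set(f["scope"]) - {var}
            for mb_scope, mb_funcs in mini_buckets:
                if len(mb_scope | scope) <= i_bound:
                    mb_scope |= scope              # in-place update of the stored set
                    mb_funcs.append(f)
                    break
            else:
                mini_buckets.append((set(scope), [f]))
        return mini_buckets
    # Each mini-bucket is then eliminated separately (all with max for MPE; for
    # summation, one mini-bucket keeps sum and the others use max), giving a bound.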

Bounding from above and below: the mini-bucket relaxation provides an upper bound on the MPE that tightens as the i-bound increases (i = 2, 5, ...), while any consistent configuration (e.g., obtained by greedy search) provides a lower bound.

Outline: Graphical models, queries, algorithms. Inference algorithms: bucket elimination. Bounded inference: (a) mini-bucket, (b) cost shifting. AND/OR search spaces and AND/OR branch and bound. Evaluation, software. Conclusions.


Example: Dual Decomposition. Reparameterization: bound the solution using decomposed optimization, solving the parts independently to get an optimistic bound, which is exact if all the copies agree. Tighten the bound by reparameterization: enforce the lost equality constraints using Lagrange multipliers, i.e., add factors that adjust each local term but cancel out in total.


Various Update Schemes. Any decomposition update can be used (message passing, subgradient, augmented Lagrangian, etc.). FGLP updates the original factors; JGLP updates the clique functions of a join graph; MBE-MM updates within each bucket only.

JGLP: Fixed-Point Updates. JGLP updates the clique functions of the join graph: use MBE to generate the join graph, define a function q_i for each clique (mini-bucket), and update each edge over its separator set (illustrated on the bucket trace of the running example).

MBE-MM: Mini-Buckets with Moment Matching. Within each bucket, the mini-buckets exchange moment-matching messages m_1, m_2 (their max-marginals over the bucket variable) before eliminating, which tightens the bound at little extra cost. As before, with w = 2 the computed MPE* is an upper bound (U), and instantiating a solution yields a lower bound (L).
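A minimal sketch of the moment-matching shift for two mini-buckets of the same bucket variable X, assuming their max-marginals over X have already been computed as positive arrays:

    import numpy as np

    def moment_match(max_marg_1, max_marg_2):
        """Return multiplicative shifts that equalize the two max-marginals over X."""
        mean = np.sqrt(max_marg_1 * max_marg_2)        # geometric mean
        return mean / max_marg_1, mean / max_marg_2

    m1 = np.array([0.9, 0.4])      # max-marginal of mini-bucket 1 over X (illustrative)
    m2 = np.array([0.5, 0.8])
    s1, s2 = moment_match(m1, m2)
    # The shifts cancel (s1 * s2 == 1 for every value of X), so multiplying the
    # mini-buckets by them is a valid reparameterization that only tightens the bound.
    print(m1 * s1, m2 * s2, s1 * s2)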


Iterative Tightening as Bounding Schemes.

Bounded inference for sum-product and max-sum-product: mini-bucket and weighted mini-bucket.

Marginals (sum) and Marginal MAP (max-sum-product): partition each bucket into mini-buckets with at most i variables, eliminating with MAX or SUM according to the bucket variable's type [Dechter and Rish]. MAP* is an upper bound on the marginal MAP value.


WMB on Join Graphs (WMB-JG). The weighted mini-bucket (WMB) relaxation defines a join graph; downward and upward messages are propagated until convergence and are used for cost shifting within each bucket. In practice this yields a much tighter upper bound than plain WMB.

MMAP: quality of the upper bounds. Average relative error with respect to the tightest upper bound, using a fixed number of iterations for WMB-JG(i).

Bounding from Above and Below for marginal MAP: the relaxation (WMB plus cost shifting) provides an upper bound, and any consistent configuration (e.g., from greedy search) provides a lower bound, though for marginal MAP even evaluating a configuration is NP-hard.

Outline: Graphical models, queries, algorithms. Inference algorithms. Bounded inference: mini-bucket, cost shifting. AND/OR search spaces and AND/OR branch and bound. Evaluation, software. Conclusions.

OR vs. AND/OR Search. Guided by a pseudo tree, the AND/OR search tree alternates OR nodes (variable choices) and AND nodes (value assignments), and AND nodes decompose the problem into independent subproblems. In the example the AND/OR tree has size exp(4) versus exp(6) for the OR tree; in general the AND/OR tree is searched in time exponential in the pseudo-tree height, which is bounded by w* log n, using only O(n) space.

AND/OR vs. OR Search with Constraints: the figure contrasts the two search spaces when constraint propagation prunes inconsistent branches.

From Search Trees to Search Graphs: any two nodes that root identical subtrees or subgraphs can be merged.

From the AND/OR Tree: the full AND/OR search tree for an example problem over variables A through K, guided by its pseudo tree.

An AND/OR Graph: merging nodes that root identical subproblems yields the much smaller AND/OR search graph for the same example.

Merging based on context: context(X) is the set of ancestors of X in the pseudo tree that are connected (in the primal graph) to X or to descendants of X. Nodes of X that agree on the context assignment root identical subproblems and can be merged.
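A sketch of that context computation, assuming the pseudo tree is given as a parent map and the primal graph as an edge list (all names and the small example are illustrative):

    def contexts(primal_edges, parent):
        """OR contexts: ancestors of X (pseudo tree) adjacent to X or a descendant of X."""
        adj = {}
        for u, v in primal_edges:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        children = {}
        for c, p in parent.items():
            if p is not None:
                children.setdefault(p, []).append(c)

        def ancestors(x):
            out = []
            while parent.get(x) is not None:
                x = parent[x]
                out.append(x)
            return out

        def descendants(x):
            out = {x}
            for c in children.get(x, []):
                out |= descendants(c)
            return out

        return {x: [a for a in ancestors(x)
                    if any(a in adj.get(d, set()) for d in descendants(x))]
                for x in parent}

    # Example: primal graph A-B, A-C, B-C, B-D; pseudo tree A -> B -> {C, D}.
    print(contexts([("A", "B"), ("A", "C"), ("B", "C"), ("B", "D")],
                   {"A": None, "B": "A", "C": "B", "D": "B"}))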

Counting Solutions (sum-product on the AND/OR space): an OR node applies the marginalization operator (summation over its children), an AND node applies the combination operator (product over its children), and the value of each node equals the number of solutions below it.
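A recursive sketch of this counting computation guided by a pseudo tree (no caching; adding context-indexed caching gives the AND/OR-graph version). The small constraint problem at the bottom is hypothetical:

    def or_value(var, assignment, pseudo_children, domains, constraints):
        """OR node: sum over values; AND node: product of independent child subproblems."""
        total = 0
        for x in domains[var]:
            assignment[var] = x
            if all(c(assignment) for c in constraints.get(var, [])):
                val = 1
                for child in pseudo_children.get(var, []):
                    val *= or_value(child, assignment, pseudo_children,
                                    domains, constraints)
                total += val
            del assignment[var]
        return total

    # Root X with independent children Y and Z, domains {0,1,2}, constraints
    # Y != X and Z != X; the number of solutions is 3 * 2 * 2 = 12.
    pseudo_children = {"X": ["Y", "Z"]}
    domains = {"X": [0, 1, 2], "Y": [0, 1, 2], "Z": [0, 1, 2]}
    constraints = {"Y": [lambda a: a["Y"] != a["X"]],
                   "Z": [lambda a: a["Z"] != a["X"]]}
    print(or_value("X", {}, pseudo_children, domains, constraints))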

Weighted AND/OR Search Space: the arcs from OR nodes to AND nodes carry weights derived from the input functions (the CPTs, conditioned on the evidence); the figure shows the weighted AND/OR tree and the corresponding weighted context-minimal AND/OR graph for a small belief network.

Weighted AND/OR Multi-Valued Decision Diagram (AOMDD): reducing the weighted context-minimal AND/OR graph further yields a canonical, unique and minimal weighted AOMDD.

Answering Queries (sum-product): applying sum-product over the weighted AND/OR graph is easy; in the example it computes the probability of the evidence (0.2448) by propagating node values bottom-up.

Answering Queries: sum-product (belief updating) with caching. Node values are stored in cache tables indexed by context, so each node of the context-minimal graph is evaluated only once.

Pseudo Trees (Freuder 1985; Bayardo 1995; Bodlaender and Gilbert). For any graph there is a pseudo tree of height h <= w* log n. The figure compares, for the same graph: (b) a DFS tree of depth 3, (c) a pseudo tree of depth 2, and (d) a chain of depth 6.

All Four Search Spaces: the full OR search tree (126 nodes), the context-minimal OR search graph (28 nodes), the full AND/OR search tree (54 AND nodes), and the context-minimal AND/OR search graph (18 AND nodes). The AND/OR graph takes time and space O(n k^w*), versus O(n k^pw*) for the OR graph (pw* is the pathwidth). Searching these spaces computes any query, including constraint satisfaction, optimization (MPE), weighted counting (P(e)), and marginal MAP, and any query is best computed over the context-minimal AND/OR space.

MPE: AND/OR branch-and-bound search.

Basic Heuristic Search Schemes. A heuristic evaluation function f(x^p) bounds the best extension of the partial assignment x^p and guides the search. We focus on: 1. Branch and Bound, which uses f(x^p) to prune the depth-first search tree and needs only linear space; 2. Best-First Search, which always expands the node with the best heuristic value f(x^p) but needs lots of memory.

Classic Branch and Bound. Each node n is a COP subproblem (defined by the current conditioning). With f(n) = g(n) + h(n), where g(n) is the cost accumulated so far and h(n) under-estimates the optimal cost below n, f(n) is a lower bound; prune n whenever f(n) >= U, where the upper bound U is the best solution found so far.

Mini-Bucket Heuristics for Search (Kask and Dechter 2001; Kask, Dechter and Marinescu; Otten). The heuristic h(x) is computed from the mini-bucket messages of MBE(i), compiled before or during search; for example, for the partial assignment (a, e) extended by d, the evaluation function is f(a, e, d) = P(a) * h(d, a) * h(e, a).
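A compact sketch of depth-first branch and bound in this max-product setting, assuming a caller-supplied partial-assignment cost g and an upper-bounding heuristic h; in the talk h would come from the pre-compiled MBE(i) functions, as in the f(a, e, d) example above:

    def branch_and_bound(variables, domains, g, h, idx=0, assignment=None, best=0.0):
        """Depth-first BnB for max-product. g(assignment): product of the factors fully
        instantiated by the partial assignment; h(assignment, next_idx): upper bound on
        the best completion (e.g., from mini-bucket functions). Returns the best value."""
        if assignment is None:
            assignment = {}
        if idx == len(variables):
            return max(best, g(assignment))      # full assignment: a candidate lower bound
        var = variables[idx]
        for x in domains[var]:
            assignment[var] = x
            if g(assignment) * h(assignment, idx + 1) > best:   # optimistic bound test
                best = branch_and_bound(variables, domains, g, h,
                                        idx + 1, assignment, best)
            del assignment[var]
        return best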

AND/OR Branch and Bound [Marinescu and Dechter, AIJ 2009]: exploits the decomposition into independent subproblems, prunes based on the current best solution and the heuristic estimate (mini-bucket heuristic), and caches subproblem values in context-indexed tables (e.g., a table for a subproblem that is independent of earlier variables).

MPE: anytime, BnB, best-first, and recursive best-first schemes. Anytime variants: breadth-rotating AND/OR BnB, weighted-heuristic AND/OR search, and finding the m best solutions.

Marginal MAP: AND/OR BnB Search. AND/OR branch and bound over the appropriate (constrained) search space, guided by weighted mini-bucket plus cost-shifting heuristics.

AND/OR Search Spaces for MMAP [Marinescu, Dechter and Ihler, 2014]: from the primal graph, build a constrained pseudo tree in which the MAP variables appear above the SUM variables.

AND/OR Search Spaces for MMAP, continued. Node types: OR nodes of MAP variables maximize, OR nodes of SUM variables sum, and AND nodes multiply; arc weights are derived from the input functions; problem decomposition is exploited over the MAP variables.

AND/OR Search for Marginal MAP. AOBB: depth-first AND/OR branch and bound, a depth-first traversal of the AND/OR search graph that prunes only at OR nodes corresponding to MAP variables; the cost of a MAP assignment is obtained by searching its SUM subproblem. AOBF: best-first AND/OR search, a best-first (AO*) traversal of the AND/OR space over the MAP variables, with each SUM subproblem solved exactly. RBFAOO: recursive best-first AND/OR search, a recursive best-first traversal of the AND/OR graph; for SUM subproblems the threshold is set to infinity (equivalent to depth-first search). These also serve as anytime marginal MAP solvers.

Outline: Graphical models, queries. Inference algorithms. Bounding inference schemes (mini-bucket, reparameterization). AND/OR search. Evaluation, software. Conclusions and recent work: parallelism, m-best, weighted best-first, marginal MAP, tree SLS.



Empirical evaluation: haplotype problems, with a time bound of 24 hours.

Marginal MAP results.

Summary and Future Work. I have shown that the primal graph of a graphical model can suggest effective relaxations and heuristics, suggest a compact search graph, and accommodate variational schemes, yielding one of the best solvers; ILP and Boolean methods can be incorporated as well. Current work: anytime schemes that minimize the upper/lower bound gap at termination, and anytime summation solvers.

Thank You! For publications see: http://www.ics.uci.edu/~dechter/publications.html. Thanks to Kalev Kask, Irina Rish, Bozhena Bidyuk, Robert Mateescu, Radu Marinescu, Vibhav Gogate, Emma Rollon, Lars Otten, Natalia Flerova, Andrew Gelfand, William Lam, and Junkyu Lee.