Lecture: Iterative Search Methods


Overview
- Constructive search is exponential in the worst case.
- State-space (iterative) search exhibits better performance on some problems.
- Active research in understanding heuristic and iterative search methods.

Topics
- Comparison of iterative vs. constructive search
- Iterative repair as state-space search
- The notion of a local neighbourhood
- The min-conflicts heuristic in backtrack search
- The simulated annealing algorithm
- GSAT
- Randomized algorithms

October 19, 1998 Copyright 1998 by Bill Havens 1 of 26

Iterative vs. Constructive Search

Constructive Search Methods
- Goal: explore the search space nondeterministically but systematically.
- Notion of extending a partial solution, starting from the empty solution.
- On failure, backtrack to some previous partial solution.
- Complete: every possible solution can be found. Guarantees termination.
- Precludes the use of heuristic knowledge to guide backtracking.

Iterative Search Methods
- Goal: explore the search space using local information about solutions.
- Starts with a total answer (which may not be a solution); often a greedy algorithm is used to generate the starting point.
- Assumes a metric on states which gives a (usually) accurate measure of which direction to search.
- Iteratively improves the answer until either:
  1. all constraints are satisfied; and/or
  2. an acceptably good solution is found.
- Not guaranteed to terminate, hence incomplete.

Notion of Neighbourhood

A neighbour of some point X ∈ D1 × ... × Dn is another point Y ∈ D1 × ... × Dn such that X ≠ Y but Y is near to X under some metric.

Possible Metrics
- Geometric distance: a neighbour Y is any point such that |X − Y| < d, where d is some maximum distance.
- Hamming distance: point Y differs from X in at most d bits (see Limited Discrepancy Search later).
- Non-geometric: other neighbourhood topologies (research topic).
- Heuristic: prior knowledge of the connectedness of the solution space in the neighbourhood of X.
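For discrete assignments, the Hamming-distance metric can be sketched in Python (the function names here are illustrative, not from the lecture). The d = 1 Hamming neighbourhood of an assignment is exactly the set of assignments reachable by changing one variable:

```python
def hamming(x, y):
    """Hamming distance: the number of positions where x and y differ."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y))

def neighbours(x, domain):
    """The d = 1 Hamming neighbourhood of tuple x: every assignment that
    changes exactly one position to a different value from `domain`."""
    return [x[:i] + (v,) + x[i + 1:]
            for i in range(len(x)) for v in domain if v != x[i]]
```

Every point this generator produces is at Hamming distance exactly 1 from x.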

Smoothness of the Solution Space
- Assumption: if X is a solution, then X ± ε is probably also a solution.
- Hill climbing depends on the gradient being defined in the neighbourhood of X.
- Consider the following...

Hill Climbing

[Figure: an objective function F plotted over a neighbourhood of points; point 4 lies on a false maximum.]
- The algorithm iterates uphill towards a solution.
- At point 4, it moves towards a false maximum. No escape possible!
- What can we do?
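The false-maximum trap can be reproduced with a minimal greedy hill climber (a sketch; the names and the example landscape are mine, not the lecture's):

```python
def hill_climb(start, neighbours, value):
    """Greedy hill climbing: repeatedly move to the best neighbour while it
    improves the objective; stops at the first maximum found, which may
    only be a local (false) one."""
    current = start
    while True:
        best = max(neighbours(current), key=value)
        if value(best) <= value(current):
            return current      # no uphill neighbour: a (possibly false) maximum
        current = best

# A 1-D landscape with a false maximum at index 3 and the true one at index 10.
F = [0, 1, 2, 3, 2, 1, 0, 5, 6, 7, 8]
nbrs = lambda i: [j for j in (i - 1, i + 1) if 0 <= j < len(F)]
```

Starting at index 0 the climber is trapped at the false maximum (index 3, value 3); only a start on the right-hand hill reaches the true maximum at index 10, value 8.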

Variable and Value Ordering Heuristics

Variable Ordering: which variable should be chosen next?
- All variables have to be assigned, so pick the toughest first. Called the First-Fail Principle.
- Secretary's dilemma: how to schedule a meeting of many busy executives?

Value Ordering: which value for the chosen variable should be tried?
1. Least-commitment: pick a value such that the resulting CSP is likely to have a solution.
2. Greedy: pick a value that minimizes / maximizes the objective function.
3. Min-conflicts: pick a value which violates the smallest number of constraints on the variable.

More on the min-conflicts heuristic...

Min-Conflicts Heuristic

Reference: M. Johnston & S. Minton, "Analysing a Heuristic Strategy for Constraint Satisfaction and Scheduling", in M. Zweben & M. Fox (eds.), Intelligent Scheduling, Morgan Kaufmann, 1994, pp. 257-289.

Basic Idea
- Given: a CSP with binary constraints and a total (but inconsistent) assignment of values to variables.
- Two variables X and Y conflict if their assigned values violate a constraint C.

Procedure
1. Select any conflicting variable and assign it a new value that minimizes the number of conflicts with other related variables.
2. Break ties randomly.
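As a concrete sketch in Python, with n-queens as the example (the helper names are mine, not from the reference), the min-conflicts value choice with random tie-breaking looks like:

```python
import random

def queen_conflicts(col, row, assignment):
    """Number of queens in `assignment` (a dict column -> row) that a queen
    placed at (col, row) would conflict with: same row or same diagonal."""
    return sum(1 for c, r in assignment.items()
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts_value(var, domain, assignment, conflicts):
    """Pick the value for `var` that violates the fewest constraints,
    breaking ties randomly."""
    counts = {v: conflicts(var, v, assignment) for v in domain}
    best = min(counts.values())
    return random.choice([v for v, c in counts.items() if c == best])
```

With three queens already on row 0 of columns 0-2, placing the fourth queen on row 0 would give 3 conflicts, while rows 1, 2 and 3 each give exactly one, so the heuristic picks one of those at random.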

Example

[Figure: constraint graph over variables X and Y with constraints C1-C6; C6 connects X and Y.]

Violated constraints per candidate value:
- X = a1: (C6, C1)
- X = a2: (C1)
- X = a3: (C6, C1, C2)
- Y = b1: (C6, C3)
- Y = b2: (C3, C4)
- Y = b3: (C6, C3, C5)

Discussion
- Assume the culprit is Y: which value minimizes constraint violations?
- What about the min-conflicts heuristic for n-ary constraints? How do you count conflicting variables?

Two aspects to the repair problem:
1. Identify the culprit variable.
2. Choose a new value for the culprit variable.

Max-Conflicts Heuristic
1. Choose as culprit the variable which maximizes the number of conflicting constraints.
2. Break ties randomly.
3. Use min-conflicts to assign it a new value.

Iterative Repair Methods

Reference: S. Minton et al., "Minimizing Conflicts: A Heuristic Repair Method for Constraint Satisfaction and Scheduling Problems", Artificial Intelligence 58, 1992, pp. 161-205.

Introduction
- An alternative to constructive (backtracking) methods.
- Applicable to scheduling and to CSPs in general.
- Developed in response to the exponential behaviour of constructive techniques.
- Comparison: 100-queens is very hard for backtracking, while 1,000,000-queens is easy for iterative repair!
- Caveat: n-queens has a solution density which increases with n.

Basic Idea
1. Start with a complete (but inconsistent) variable assignment.
2. Heuristically choose some inconsistent variable assignment.
3. Iteratively repair variable assignments until a consistent solution is found.

Iterative Repair Methodology

Algorithm: given a point X ∈ D1 × ... × Dn in the search space, call IterSearch(X).

    IterSearch(X)
        if goal(X) then return(X)
        let Y = some neighbour(X)
        IterSearch(Y)
    end

Justification
- More effective on many CSPs than constructive methods.
- Argument: a complete but inconsistent assignment provides more guidance (information) than a partial assignment.
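A min-conflicts instantiation of IterSearch can be sketched in Python (the repair loop and helper names are my own, not the paper's code); each iteration moves to a neighbour that differs in one repaired variable:

```python
import random

def queen_conflicts(col, row, assignment):
    """Queens in `assignment` (column -> row) attacked from (col, row)."""
    return sum(1 for c, r in assignment.items()
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def iterative_repair(variables, domains, conflicts, max_steps=20_000):
    """Start from a random complete assignment; while some variable is in
    conflict, reassign one such variable to a min-conflicts value.
    Returns a consistent assignment, or None when the step budget runs
    out (the method is incomplete)."""
    assignment = {v: random.choice(domains[v]) for v in variables}
    for _ in range(max_steps):
        conflicted = [v for v in variables
                      if conflicts(v, assignment[v], assignment) > 0]
        if not conflicted:
            return assignment                    # goal(X) reached
        var = random.choice(conflicted)          # pick a culprit variable
        counts = {val: conflicts(var, val, assignment)
                  for val in domains[var]}
        best = min(counts.values())
        assignment[var] = random.choice(
            [val for val, c in counts.items() if c == best])
    return None
```

For 8-queens, `iterative_repair(range(8), {c: list(range(8)) for c in range(8)}, queen_conflicts)` almost always returns a valid placement after a few dozen repairs.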

Shortcomings
- Depends on an effective notion of neighbourhood.
- Requires a good objective function over the neighbourhood; domain-independent objective functions are hard to find.
- Incomplete search: no termination guarantee is possible.

Using Min-Conflicts in Backtrack Search

Method
- Start with a complete assignment of variables.
- Let VarsLeft = variables which have not yet been repaired.
- Let VarsDone = variables already repaired.
- The algorithm attempts to repair each variable only once.
- If no consistent assignment is possible from VarsLeft, then backtrack.

Informed Backtrack Algorithm

    Procedure InformedBacktrack(VarsLeft, VarsDone)
        if the assignment is consistent then halt
        let Var = some conflicting variable in VarsLeft
        VarsLeft := VarsLeft - {Var}
        VarsDone := VarsDone + {Var}
        let Values = domain of Var sorted by min-conflicts
        for each Value in Values until a solution is found
            if Value does not conflict with VarsDone then
                Var := Value
                InformedBacktrack(VarsLeft, VarsDone)
    end

    Begin program
        let VarsLeft = all variables in the CSP, with initial assignments
        let VarsDone = nil
        InformedBacktrack(VarsLeft, VarsDone)
    end program

Experimental Results

The ubiquitous n-queens problem, evaluated using both the Iterative Repair and Informed Backtrack algorithms:

    n       Standard Backtrack   Min-Conflicts Repair   Informed Backtrack
    10^1    53.8                 57.0                   46.8
    10^2    4473 (70%)           55.6                   25.0
    10^3    88650 (13%)          48.8                   30.7
    10^4    too big              48.5                   27.5
    10^5    too big              52.8                   27.8
    10^6    too big              48.3                   26.4

Notes:
1. A bound of n × 100 queen movements was enforced.
2. Results in parentheses indicate the fraction of runs that succeeded within the move limit.

Simulated Annealing Search

Reference: S. Kirkpatrick et al., "Optimization by Simulated Annealing", Science 220, #4598, 1983.

Overview
- How can local maxima be avoided, while retaining the advantages of hill-climbing search?
- Based on an analogy to the slow cooling of metal (called annealing), which allows crystal structures to form properly.
- Basic idea: use standard hill climbing, but occasionally move in a direction other than the gradient, with a probability that declines over time. This probability is analogous to the temperature T of the cooling metal.

- Requires a cooling schedule T(time).
- Pathology: all local search easily gets caught in local maxima. Random noise can avoid this problem.
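A minimal sketch of the idea in Python (maximizing; the function and schedule names are assumptions, not from the slides): uphill moves are always accepted, while a downhill move of size Δ is accepted with probability exp(Δ / T), which falls toward zero as the temperature T cools.

```python
import math
import random

def simulated_annealing(start, neighbour, value, schedule, max_steps=100_000):
    """Hill climbing with temperature-controlled random noise.
    `schedule(t)` gives the temperature at step t; the search stops when
    the temperature reaches zero or the step budget runs out."""
    current = start
    for t in range(max_steps):
        T = schedule(t)
        if T <= 0:
            break                       # fully cooled: stop
        candidate = neighbour(current)
        delta = value(candidate) - value(current)
        # Always accept uphill moves; accept downhill with prob. exp(delta/T).
        if delta > 0 or random.random() < math.exp(delta / T):
            current = candidate
    return current

# Example: maximize f(x) = -(x - 3)^2 over the integers, moving x by ±1.
f = lambda x: -(x - 3) ** 2
step = lambda x: x + random.choice([-1, 1])
geometric = lambda t: 10 * 0.95 ** t    # a geometric cooling schedule
```

Early on (T ≈ 10) the walk wanders freely; once T is small, downhill moves are effectively always rejected and the walk settles onto the maximum at x = 3.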

Simulated Annealing is Optimal

Strong result: given a slow enough cooling schedule, simulated annealing will find the optimal (maximal) solution on any landscape. But it may require a very, very long time!

[Figure 4.7 from Ginsberg]

Argument
- Normally the algorithm proceeds uphill along the gradient.
- With low probability, which decreases towards zero over time, the algorithm moves a random distance elsewhere.
- Assume the search space has a smoothly varying evaluation function.
- Suppose the evaluation function f(p) = m for the true optimum point p in the search space. Every other point x has f(x) < m.
- Initially, at high temperature T, occasional random moves cause the current point x to move among all the hills.
- Eventually a transition temperature is reached at which it is no longer possible to move off the largest hill, but still possible to move off every other hill.

Given enough random movements at this temperature, the algorithm will eventually climb, and remain, on the peak of the largest hill.

GSAT Algorithm

Reference: B. Selman, H. Levesque & D. Mitchell (1992), "A New Method for Solving Hard Satisfiability Problems", Proc. 10th Nat. Conf. on A.I., pp. 440-446.

Introduction
- A heuristic algorithm for Boolean-variable CSPs (satisfiability).
- An example of the class of iterative repair methods. Not systematic.
- Uses the objective function MaxSat, which measures, for each candidate variable flip, how many Boolean clauses the flip would leave satisfied; the chosen flip maximizes this number.
- Suffers the local-maxima problem of all hill-climbing algorithms.

GSAT Algorithm

    for i := 1 to MaxTries do
        P := some random truth assignment
        for j := 1 to MaxFlips do
            if P is consistent then halt
            else flip the assignment in P chosen by the MaxSat heuristic
    return failure
    end

Algorithm explained!
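The loop above can be sketched in Python (clause encoding and names are my own): clauses are lists of nonzero integers in the usual DIMACS style, where literal k means variable |k|, negated if k < 0, and the MaxSat score of a flip is the number of clauses the flipped assignment satisfies.

```python
import random

def gsat(clauses, n_vars, max_tries=50, max_flips=500):
    """GSAT: restart from random truth assignments; on each step flip the
    variable whose flip satisfies the most clauses (MaxSat), breaking
    ties randomly. Returns a satisfying assignment or None on failure."""
    def n_satisfied(assign):
        return sum(any((lit > 0) == assign[abs(lit)] for lit in clause)
                   for clause in clauses)

    for _ in range(max_tries):
        assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if n_satisfied(assign) == len(clauses):
                return assign                       # P is consistent
            scores = {}
            for v in assign:                        # score every possible flip
                assign[v] = not assign[v]
                scores[v] = n_satisfied(assign)
                assign[v] = not assign[v]           # undo the trial flip
            best = max(scores.values())
            flip = random.choice([v for v, s in scores.items() if s == best])
            assign[flip] = not assign[flip]
    return None
```

This sketch rescoring every clause on every flip is quadratic per step; practical GSAT implementations maintain per-variable flip scores incrementally.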