Local Search. CS 486/686: Introduction to Artificial Intelligence

Overview
- Uninformed search: very general, assumes no knowledge about the problem (BFS, DFS, IDS)
- Informed search: heuristics, A* search and variations
- Search and optimization: what are the problem features? Iterative improvement (hill climbing, simulated annealing); genetic algorithms

Introduction
- Both uninformed and informed search systematically explore the search space
- They keep one or more paths in memory; the solution is a path to the goal
- For many problems, the path itself is unimportant

Examples
A ∨ ¬B ∨ C
¬A ∨ C ∨ D
B ∨ D ∨ ¬E
¬C ∨ ¬D ∨ ¬E

Informal Characterization
- A combinatorial structure is being optimized
- Constraints have to be satisfied
- There is a cost function, and we want to find a good solution
- Searching all possible states is infeasible
- It is often easy to find some solution to the problem, but often provably hard (NP-complete) to find the best solution

Typical Example: TSP
The goal is to minimize the length of the route.
- Constructive method: start from scratch and build up a solution
- Iterative improvement method: start with a solution (possibly suboptimal or broken) and improve it

Constructive Methods
For the optimal solution we can use A*. But we do not need to know how we got the solution; we just want the solution.

Iterative Improvement Methods
Idea: imagine all possible solutions laid out on a landscape. The goal is to find the highest (or lowest) point.

Iterative Improvement Methods
1. Start at some random point
2. Generate all possible points to move to
3. If the set is not empty, choose a point and move to it
4. If you are stuck (the set is empty), restart

Iterative Improvement Methods
What does it mean to generate points to move to? Generating the move set depends on the application. For TSP, a standard choice is the 2-swap move.
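One common reading of the TSP 2-swap move is segment reversal (the 2-opt move); this sketch assumes that interpretation, and the names `two_swap` and `neighbours` are illustrative, not from the slides.

```python
def two_swap(tour, i, j):
    """Return a neighbouring tour: reverse the segment between positions i and j (i < j)."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def neighbours(tour):
    """Generate the full move set: every tour reachable by one 2-swap."""
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 1, n):
            yield two_swap(tour, i, j)

# Example: reversing the middle segment of a 5-city tour.
print(two_swap([0, 1, 2, 3, 4], 1, 3))  # → [0, 3, 2, 1, 4]
```

The move set grows quadratically with tour length, which is the kind of "not too many, not too few" trade-off discussed below.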

Hill Climbing (Gradient Descent)
Main idea: always take a step in the direction that improves the current solution value the most. It is a variation of best-first search and very popular for learning algorithms. Hill climbing is "like trying to find the top of Mount Everest in a thick fog while suffering from amnesia" (Russell and Norvig).

Hill Climbing
1. Start with some initial configuration S
2. Let V be the value of S
3. Let Si, i = 1, ..., n, be the neighbouring configurations, with corresponding values Vi
4. Let Vmax = maxi Vi be the value of the best configuration, and Smax the corresponding configuration
5. If Vmax < V, return S (a local optimum)
6. Let S ← Smax and V ← Vmax, and go to 3
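The steps above can be sketched as follows. One detail is a choice of this sketch rather than the slide: it also stops on a plateau, returning as soon as no neighbour is strictly better.

```python
def hill_climb(start, neighbours, value):
    """Steepest-ascent hill climbing.

    `neighbours(s)` returns the move set of s; `value(s)` is its
    heuristic value (higher is better). Both are supplied by the caller.
    """
    s, v = start, value(start)
    while True:
        best = max(neighbours(s), key=value, default=None)
        if best is None or value(best) <= v:
            return s  # local optimum (or plateau): no neighbour improves on V
        s, v = best, value(best)

# Toy usage: maximise -(x-3)^2 over the integers by stepping +/-1.
# The start state and move set are illustrative, not from the slides.
result = hill_climb(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2)
print(result)  # → 3
```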

Judging Hill Climbing
Good news:
- Easy to program
- Requires no memory of where we have been
- It is important to have a good set of moves: not too many, not too few

Judging Hill Climbing
Bad news: it can get stuck at
- Local maxima/minima
- Plateaus

Improving Hill Climbing
- Plateaus: allow sideways moves, but be careful, since the search might move sideways forever
- Local maxima: random restarts. If at first you do not succeed, try, try again!

Randomized Hill Climbing
Like hill climbing, except:
- You choose a random state from the move set
- You move to it if it is better than the current state
- Continue until you are bored

More Randomization
Hill climbing is incomplete: it can get stuck at local optima. A random walk is complete but very inefficient. New idea: allow the algorithm to make some bad moves in order to escape local optima.

Example: GSAT
Clauses (1 = satisfied, 0 = unsatisfied under the current configuration):
A ∨ ¬B ∨ C      1
¬A ∨ C ∨ D      1
B ∨ D ∨ ¬E      0
¬C ∨ ¬D ∨ ¬E    1
¬A ∨ ¬C ∨ E     1
Configuration: A=1, B=0, C=1, D=0, E=1
The goal is to maximize the number of satisfied clauses: Eval(config) = # satisfied clauses.
GSAT move set: flip any one variable.
WALKSAT (randomized GSAT): pick a random unsatisfied clause and consider flipping each variable in the clause. If any flip improves Eval, accept the best one. If none improves Eval, then with probability p pick the move that is least bad, and with probability (1 − p) pick a random one.
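A sketch of WALKSAT on the slide's five clauses. Note this uses the common noise-parameter variant (with probability p flip a random variable from the clause, otherwise flip the greedy best), which differs slightly from the slide's least-bad rule; `p`, `max_flips`, and the seed are illustrative choices.

```python
import random

# Clauses from the slide: A∨¬B∨C, ¬A∨C∨D, B∨D∨¬E, ¬C∨¬D∨¬E, ¬A∨¬C∨E.
# A literal is (variable, polarity); polarity True means un-negated.
CLAUSES = [
    [("A", True), ("B", False), ("C", True)],
    [("A", False), ("C", True), ("D", True)],
    [("B", True), ("D", True), ("E", False)],
    [("C", False), ("D", False), ("E", False)],
    [("A", False), ("C", False), ("E", True)],
]

def sat(clause, config):
    """A clause is satisfied if any of its literals matches the assignment."""
    return any(config[v] == pol for v, pol in clause)

def eval_config(config):
    """Eval(config) = number of satisfied clauses."""
    return sum(sat(c, config) for c in CLAUSES)

def walksat(config, p=0.5, max_flips=1000, rng=None):
    rng = rng or random.Random(0)
    config = dict(config)
    for _ in range(max_flips):
        unsat = [c for c in CLAUSES if not sat(c, config)]
        if not unsat:
            return config                # all clauses satisfied
        clause = rng.choice(unsat)       # pick a random unsatisfied clause
        if rng.random() < p:             # noise step: flip a random variable
            var = rng.choice(clause)[0]
        else:                            # greedy step: flip the variable that helps most
            var = max((v for v, _ in clause),
                      key=lambda v: eval_config({**config, v: not config[v]}))
        config[var] = not config[var]
    return config
```

Starting from the slide's configuration (Eval = 4, since only B ∨ D ∨ ¬E is unsatisfied), the loop quickly reaches a fully satisfying assignment on this small instance.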

Simulated Annealing
1. Let S be the initial configuration and V = Eval(S)
2. Let i be a random move from the move set, and let Si be the next configuration, with Vi = Eval(Si)
3. If V < Vi, then S = Si and V = Vi
4. Else, with probability p, S = Si and V = Vi
5. Go to 2 until you are bored

What About p?
How should we choose the probability of making a bad move?
- p = 0.1 (or some other fixed value)?
- Decrease p with time?
- Decrease p with time and as V − Vi increases?

Selecting Moves in Simulated Annealing
If the new value Vi is better than the old value V, definitely move to the new solution. If the new value Vi is worse than the old value V, move to the new solution with probability given by the Boltzmann distribution: p = e^(−(V − Vi)/T). Here T > 0 is a parameter called the temperature. It starts high and decreases over time towards 0. If T is close to 0, the probability of making a bad move is almost 0.
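Combining the Boltzmann acceptance rule with a cooling schedule gives a minimal annealing loop. The geometric schedule T ← cooling·T and the parameter values `t0`, `cooling`, and `steps` are assumptions of this sketch, not from the slides.

```python
import math
import random

def accept(v_old, v_new, temperature, rng=random):
    """Acceptance rule (maximisation): always take a better move;
    take a worse one with probability exp(-(v_old - v_new) / T)."""
    if v_new > v_old:
        return True
    return rng.random() < math.exp(-(v_old - v_new) / temperature)

def anneal(start, random_move, value, t0=10.0, cooling=0.99, steps=5000, rng=None):
    rng = rng or random.Random(0)
    s, v, t = start, value(start), t0
    for _ in range(steps):
        s2 = random_move(s, rng)         # a random move from the move set
        v2 = value(s2)
        if accept(v, v2, t, rng):
            s, v = s2, v2
        t = max(t * cooling, 1e-9)       # cool down, keeping T strictly positive
    return s

# Toy usage: maximise -(x-3)^2 over the integers, moving x by +/-1.
best = anneal(0, lambda x, r: x + r.choice([-1, 1]), lambda x: -(x - 3) ** 2)
```

Early on (high T) the walk is exploratory; as T shrinks the loop behaves like randomized hill climbing, matching the two phases described on the next slide.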

Properties of Simulated Annealing
- When T is high, exploratory phase: even bad moves have a chance of being picked (random walk)
- When T is low, exploitation phase: bad moves have a low probability of being chosen (randomized hill climbing)
- If T is decreased slowly enough, then simulated annealing is guaranteed to reach the optimal solution

Genetic Algorithms
Candidate solutions are encoded into a representation which allows certain operations to occur, usually a bit-string. The representation is key and needs to be thought out carefully. An encoded candidate solution is an individual. Each individual has a fitness: a numerical value associated with the quality of its solution. A population is a set of individuals. Populations change over generations by applying operators to them: selection, mutation, crossover.

Typical Genetic Algorithm
Initialize: population P ← N random individuals
Evaluate: for each x in P, compute fitness(x)
Loop:
  For i = 1 to N:
    Select 2 parents, each with probability proportional to its fitness score
    Crossover the 2 parents to produce a new bit-string (child)
    With some small probability, mutate the child
    Add the child to the population
Until some child is fit enough or you get bored
Return the best child in the population according to the fitness function
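The loop above can be sketched as follows, using fitness-proportionate selection and single-point crossover on bit-strings. The parameters `pop_size`, `p_mut`, `generations`, and the ONE-MAX toy fitness (maximise the number of 1s, a standard GA test problem) are illustrative choices, not from the slides.

```python
import random

def genetic_algorithm(fitness, n_bits, pop_size=20, p_mut=0.05,
                      generations=100, rng=None):
    """Individuals are bit-strings (lists of 0/1)."""
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            # Select 2 parents with probability proportional to fitness.
            a, b = rng.choices(pop, weights=[fitness(x) for x in pop], k=2)
            point = rng.randrange(1, n_bits)        # single-point crossover
            child = a[:point] + b[point:]
            # Mutate each bit independently with small probability p_mut.
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy usage: ONE-MAX over 12-bit strings.
best = genetic_algorithm(fitness=sum, n_bits=12)
```

This sketch replaces the whole population each generation (a generational GA); steady-state variants that replace only a few individuals are also common.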

Selection
- Fitness-proportionate selection: can lead to overcrowding
- Tournament selection: pick i, j at random with uniform probability; with probability p select the fitter one
- Rank selection: sort all individuals by fitness; the probability of selection is proportional to rank
- Softmax (Boltzmann) selection: the probability of selecting i is proportional to e^(fitness(i)/T)
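A minimal sketch of tournament selection as described above. Drawing the two competitors without replacement and the default p = 0.8 are illustrative choices of this sketch.

```python
import random

def tournament_select(population, fitness, p=0.8, rng=None):
    """Binary tournament: pick two distinct individuals uniformly at
    random, then with probability p return the fitter one."""
    rng = rng or random.Random()
    i, j = rng.sample(population, 2)
    fit, unfit = (i, j) if fitness(i) >= fitness(j) else (j, i)
    return fit if rng.random() < p else unfit
```

Because selection pressure is controlled by p rather than by raw fitness values, tournament selection avoids the overcrowding problem of fitness-proportionate selection.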

Crossover
Combine parts of individuals to create new ones. For each pair, choose a random crossover point, cut the individuals there, and swap the pieces:
101|0101 and 011|1110 cross over to give 011|0101 and 101|1110
Implementation: use a crossover mask m. Given two parents a and b, the offspring are (a ∧ m) ∨ (b ∧ ¬m) and (a ∧ ¬m) ∨ (b ∧ m).
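The mask formulation can be written directly over integers treated as bit-strings; `mask_crossover` and `n_bits` are illustrative names. The example reproduces the slide's 7-bit parents, with mask 1110000 selecting the top three bits.

```python
def mask_crossover(a, b, m, n_bits):
    """Crossover via a bit mask m: offspring are
    (a & m) | (b & ~m)  and  (a & ~m) | (b & m).
    All arguments are ints treated as n_bits-wide bit-strings."""
    full = (1 << n_bits) - 1  # restrict ~m to n_bits
    return (a & m) | (b & ~m & full), (a & ~m & full) | (b & m)

# The slide's example: cut 1010101 and 0111110 after the top 3 bits.
c1, c2 = mask_crossover(0b1010101, 0b0111110, 0b1110000, 7)
print(bin(c1), bin(c2))  # → 0b1011110 0b110101  (i.e. 1011110 and 0110101)
```

A contiguous mask like 1110000 gives single-point crossover; an arbitrary mask gives uniform crossover with the same formula.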

Mutation
Mutation generates new features that are not present in the original population. It typically means flipping a bit in the string: 100111 mutates to 100101. Mutation can be allowed in all individuals or just in new offspring.


Summary
- Local search is useful for optimization problems
- It is often the second-best way to solve a problem: if you can, use A* or linear programming or...
- You need to think about how to escape from local optima: random restarts, allowing for bad moves, ...