Heuristic Evaluation Function. Jonathan Schaeffer


Evaluation
Jonathan Schaeffer
jonathan@cs.ualberta.ca
www.cs.ualberta.ca/~jonathan

Most of the magic in a single-agent searcher is in the evaluation function. To obtain an optimal answer, we need an admissible lower bound. Search tree size is strongly tied to the quality of the evaluation function. This is unlike Alpha-Beta, where the evaluation function influences the quality of the answer but not really the size of the search.

Issues

How do we obtain an admissible evaluation function?
How do we improve the quality of the evaluation function?
What happens with a non-admissible heuristic?

Admissible Evaluations

Consider a relaxed version of the application: eliminate a rule to simplify the calculation of the heuristic distance. An exact solution to a relaxed problem is usually a good heuristic for the original problem.

Example

Consider Manhattan Distance for pathfinding.
Original problem: move the man to the goal subject to obstacles.
Relaxed problem: move the man to the goal assuming no obstacles.

Methodology

Define the problem formally. Remove one of the restrictions. Evaluate whether the resulting relaxed problem is easy to compute and whether the results are worthwhile. This could be automated, although that is an ongoing research topic.

Multiple Heuristics

There may be multiple good heuristics, each of which performs better in different circumstances. Given N admissible heuristics, we can compose a new, more powerful heuristic:

    H = MAX(h1, h2, ..., hN)

Heuristic Evaluation

Right now, the best way is to do this by hand; automated techniques are still in their infancy. Apply application-dependent knowledge to decide on a heuristic (or heuristics).
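The MAX composition above can be sketched in a few lines. This is a minimal sketch assuming a grid-pathfinding setting, with Manhattan and Chebyshev distances standing in for two admissible heuristics; the function names and grid setting are illustrative choices, not from the slides:

```python
# Combining admissible heuristics by taking their maximum.
# Assumed setting: pathfinding on a grid, states are (x, y) pairs.

def manhattan(state, goal):
    """Relaxed problem: ignore obstacles, move one step at a time."""
    (x1, y1), (x2, y2) = state, goal
    return abs(x1 - x2) + abs(y1 - y2)

def chebyshev(state, goal):
    """Another relaxation: diagonal moves also cost 1."""
    (x1, y1), (x2, y2) = state, goal
    return max(abs(x1 - x2), abs(y1 - y2))

def h_max(state, goal, heuristics):
    """The maximum of admissible heuristics is itself admissible:
    each is a lower bound, so the largest is still a lower bound."""
    return max(h(state, goal) for h in heuristics)

# h_max((0, 0), (3, 5), [manhattan, chebyshev]) -> 8
```

The same pattern generalizes to any number of admissible heuristics, including pattern-database lookups.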

Pattern Database [1]

Endgame databases are a perfect lower bound for a (small) subset of positions. Pattern databases compute lower bounds for subsets (patterns) of a state. Using extra data, we can improve the lower bound estimate.

Pattern Databases

Define a subset of the state. Enumerate all possibilities for that subset and pre-compute the optimal distance to solving that relaxed problem. The larger the subset, the more effective the heuristic.

Using a Pattern Database

Pre-compute the minimum number of moves to achieve a subset of the goal state. Many real-world problems have symmetries that can be exploited. 15-puzzle symmetry: reflect horizontally and vertically, reflect along the axes, and use the maximum of all the resulting lower bounds.
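The pre-computation described above can be sketched as a breadth-first search backward from the abstract goal. The sketch below assumes a tiny 2x3 sliding-tile puzzle with a pattern of tiles 1 and 2; the puzzle size, representation, and the choice to minimize out the blank position are illustrative assumptions, not from the slides:

```python
from collections import deque

# Pattern-database sketch for an assumed 2x3 sliding-tile puzzle
# (tiles 1..5, blank 0). The pattern keeps only tiles 1 and 2; the
# other tiles are abstracted away as indistinguishable 'x' pieces.

ROWS, COLS = 2, 3
GOAL = (1, 2, 'x', 'x', 'x', 0)   # abstract goal: tiles 1, 2, blank kept

def neighbors(state):
    """All abstract states reachable by sliding one piece into the blank."""
    b = state.index(0)
    r, c = divmod(b, COLS)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS:
            n = nr * COLS + nc
            s = list(state)
            s[b], s[n] = s[n], s[b]
            yield tuple(s)

def build_pdb():
    """Breadth-first search backward from the abstract goal; the BFS
    distance to each abstract state is an admissible lower bound on
    solving any concrete state that matches the pattern."""
    dist = {GOAL: 0}
    frontier = deque([GOAL])
    while frontier:
        s = frontier.popleft()
        for t in neighbors(s):
            if t not in dist:
                dist[t] = dist[s] + 1
                frontier.append(t)
    # Minimize out the blank: key only on the pattern tiles' positions
    # (taking the minimum over blank positions keeps admissibility).
    pdb = {}
    for s, d in dist.items():
        key = (s.index(1), s.index(2))
        pdb[key] = min(pdb.get(key, d), d)
    return pdb

pdb = build_pdb()
# Lookup during search: h(state) = pdb[(state.index(1), state.index(2))]
```

Real pattern databases follow the same recipe at scale (e.g. the fringe tiles of the 15-puzzle); only the state encoding and storage format change.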

15-puzzle: H0

The simplest heuristic evaluation function:
Value = 0 if a goal node; Value = 1 if not a goal node.

15-puzzle: H1

Count the number of misplaced tiles. This assumes the cost of placing each misplaced tile is 1.

15-puzzle: H2

Manhattan Distance: count the number of horizontal and vertical moves needed to place each tile.

15-puzzle: H3

Add in linear conflicts: two tiles in the same row/column that have to swap positions cost extra moves beyond their Manhattan Distances.

15-puzzle: H4

Pattern databases: use the pattern database shown earlier. Search with the pattern database heuristic examines far fewer nodes than with Manhattan Distance alone.

Experiments (1)

Standard set of 100 test positions (Korf). A* quickly runs out of memory and can only solve the easy problems. Must use iterative deepening to reduce storage needs.

Experiments (2)

IDA*: storage linear in the search depth.
Transposition table: a fixed number of bytes per entry.
Endgame database: all positions within a small number of moves of the goal.
Pattern database: pre-computed distances for all patterns of a subset of the tiles.

Experiments (3)

(Table: nodes searched and execution time for IDA* alone and with +TT, +DB, +TT+DB, +PDB, and +TT+DB+PDB.) Each enhancement reduces the nodes searched; combined, they yield a many-fold improvement!

Lessons Learned

Eliminate unnecessary work. Any improvement to a state's evaluation pays enormous benefits in the search space/time tradeoff. E.g., linear conflicts capture a subset of what pattern databases capture, but without the storage and runtime costs.

Space/Time

Korf has a hypothesis that there is a linear relationship between execution time and storage used. The extreme cases are zero storage and complete storage; in between, roughly doubling your storage can be used to reduce the tree by roughly a factor of two.

Non-admissible Heuristics

Admissible heuristics guarantee optimality. Non-admissible heuristics are OK as long as you do not mind the possibility of non-optimal solutions.

WIDA*

Multiply the h value by a small constant w > 1.0. This has the effect of concentrating search on paths where the h value is small. For many applications, this results in (near-)optimal solutions, but with much improved execution time.
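The WIDA* idea can be sketched as IDA* with the threshold test applied to g + w*h. The problem-specific pieces (successor function, heuristic, goal test) are assumed inputs supplied by the application; this is an illustrative sketch, not the slides' implementation:

```python
import math

# Sketch of WIDA*: iterative-deepening A* with the heuristic inflated
# by a weight w > 1.0, trading optimality for speed. For admissible h,
# returned solutions cost at most w times the optimal cost.

def wida_star(start, is_goal, successors, h, w=1.5):
    """successors(state) yields (next_state, step_cost) pairs;
    h(state) is the heuristic lower bound. Returns a solution path."""
    bound = w * h(start)
    while True:
        next_bound = math.inf

        def dfs(state, g, path):
            nonlocal next_bound
            f = g + w * h(state)          # weighted cost bound
            if f > bound:
                next_bound = min(next_bound, f)
                return None
            if is_goal(state):
                return path
            for nxt, cost in successors(state):
                if nxt not in path:       # cheap cycle check
                    result = dfs(nxt, g + cost, path + [nxt])
                    if result is not None:
                        return result
            return None

        result = dfs(start, 0, [start])
        if result is not None:
            return result
        if next_bound == math.inf:
            return None                   # exhausted: no solution
        bound = next_bound                # deepen to the next threshold
```

Setting w = 1.0 recovers plain IDA*; larger weights bias the search toward states with small h, typically shrinking the tree at the cost of possible suboptimality.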

References

[1] J. Culberson and J. Schaeffer. Pattern Databases, Computational Intelligence, vol. 14, no. 3, pp. 318-334, 1998.