Topics: Informed Search - Best-First Search, Greedy Search, A* Search
Sattiraju Prabhakar, CS771 Classes, Wichita State University
3/6/2005 AI_InformedSearch

Review: Tree Search

What is Informed Search?
- Informed search strategies use problem-specific knowledge in addition to the problem definition.
- Examples:
  - Romania touring problem: the problem-specific knowledge is the straight-line distance of a city from Bucharest (the goal state).
  - 8-puzzle problem: the problem-specific knowledge is how many tiles are out of place relative to the goal state.
- Uninformed search strategies are inefficient; informed searches can be very efficient.
Map of Romania

Romania: straight-line distances to Bucharest
  Arad 366        Mehadia 241
  Bucharest 0     Neamt 234
  Craiova 160     Oradea 380
  Dobreta 242     Pitesti 100
  Eforie 161      Rimnicu Vilcea 193
  Fagaras 176     Sibiu 253
  Giurgiu 77      Timisoara 329
  Hirsova 151     Urziceni 80
  Iasi 226        Vaslui 199
  Lugoj 244       Zerind 374

General Approach: Best-First Search

Greedy Search
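As a concrete sketch of the best-first scheme (my own illustration, not code from the slides), the following Python keeps the fringe as a priority queue ordered by an evaluation function f. Plugging in f(n) = h(n) gives greedy search; f(n) = g(n) + h(n) gives A*. The `successors` function and the heuristic table are assumed inputs.

```python
import heapq

def best_first_search(start, goal, successors, f):
    """Generic best-first search: always expand the fringe node with
    the lowest evaluation f(state, path_cost). Greedy search and A*
    are obtained by choosing f appropriately."""
    # Fringe entries: (f-value, state, path so far, path cost g)
    fringe = [(f(start, 0), start, [start], 0)]
    visited = set()
    while fringe:
        _, state, path, cost = heapq.heappop(fringe)
        if state == goal:
            return path
        if state in visited:
            continue
        visited.add(state)
        for nxt, step_cost in successors(state):
            g = cost + step_cost
            heapq.heappush(fringe, (f(nxt, g), nxt, path + [nxt], g))
    return None
```

With the straight-line-distance table as h, greedy search from Arad expands Sibiu, then Fagaras, then reaches Bucharest.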
What is Greedy Search?

Greedy Search Example (expansion steps shown on the Romania map)
Greedy Search Example (final step)

Modifying the Tree Search Algorithm

INSERT-ALL:
- Before the insert, the fringe is already sorted by h(n). (Consider the queue as an array, A_f.)
- From the expansion of the popped node, create an array of children nodes (A_c).
- Combine these two sorted arrays into a new array that is sorted by h(n); this is the new fringe.

Retrieving h(n):
- The tree search now has access to a table of h(n) values, one per state.
- When you expand a node, you retrieve the h(n) values of its children from this table.

IMPORTANT! Admissible Heuristics
- The performance of your algorithm depends on the heuristic function; it can improve or degrade based on the type of heuristic.
- The best heuristic function needs to try only one of the children of a node (the search then follows a single path, much like depth-first search).
- The worst heuristic function will try all the children of a node.
- You need to specify the heuristic function yourself; the algorithm cannot figure it out on its own.
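The INSERT-ALL step described above can be sketched as follows (an illustrative sketch, not the slides' own code): the newly generated children are sorted by h(n), looked up from the given table, and then merged with the already-sorted fringe in linear time.

```python
from heapq import merge

def insert_all(fringe, children, h):
    """INSERT-ALL for greedy search. `fringe` (the array A_f) is kept
    sorted by h(n); `children` (the array A_c) comes from expanding the
    popped node. Sort the children by h(n), then merge the two sorted
    sequences into the new fringe."""
    children_sorted = sorted(children, key=lambda n: h[n])
    return list(merge(fringe, children_sorted, key=lambda n: h[n]))
```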
Tutorial
- For the following two examples, write their h(n) functions and then apply the greedy search algorithm:
  - 8-puzzle problem
  - Blocks World problem

Properties of Greedy Search

A* Search

Basic Idea
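For the 8-puzzle part of the tutorial, the misplaced-tiles knowledge mentioned earlier can be written as a short h(n) function (a sketch; representing a state as a 9-tuple read row by row, with 0 for the blank, is my assumption):

```python
def h_misplaced(state, goal):
    """h(n) for the 8-puzzle: the number of tiles out of place
    relative to the goal state. The blank (0) is not counted."""
    return sum(1 for tile, target in zip(state, goal)
               if tile != 0 and tile != target)
```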
A* Search Example (successive expansion steps on the Romania map)
A* Search Example (final expansion steps)

Algorithm
- Modify greedy search by using f(n) = g(n) + h(n) in place of h(n).

Tutorial
- Apply the A* algorithm to the following problem:
  - 8-puzzle problem
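The modification above, replacing h(n) by f(n) = g(n) + h(n), can be sketched in Python (an illustrative sketch; the `successors` function and the heuristic `h` are assumed inputs, and duplicate states are pruned with a best-cost table):

```python
import heapq

def a_star(start, goal, successors, h):
    """A* search: expand the node with the lowest f(n) = g(n) + h(n),
    where g(n) is the path cost so far and h(n) the heuristic
    estimate of the remaining cost."""
    fringe = [(h(start), 0, start, [start])]  # (f, g, state, path)
    best_g = {start: 0}                        # cheapest known g per state
    while fringe:
        f, g, state, path = heapq.heappop(fringe)
        if state == goal:
            return path, g
        for nxt, step_cost in successors(state):
            g2 = g + step_cost
            if g2 < best_g.get(nxt, float('inf')):
                best_g[nxt] = g2
                heapq.heappush(fringe, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None
```

On the Romania map, A* from Arad prefers the route through Rimnicu Vilcea and Pitesti (total cost 418) over the shorter-looking Fagaras route, because f accounts for path cost as well as the heuristic.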
Properties of A*

Local Search

Topics: Local Search, Hill-Climbing Search, Genetic Algorithms

Two types of problems:
- Type 1: the solution is a sequence of actions.
  - Informed and uninformed searches explore search spaces systematically: we keep the search paths in memory and record which alternatives we have searched and which we have not.
- Type 2: the solution itself shows that the goal has been reached, without any need for the path.
  - Example: the 8-queens problem.
  - Problems of Type 2 are handled by local search.
Applications of Local Search
- Integrated-circuit design
- Factory-floor layout
- Job-shop scheduling
- Automated programming
- Telecommunications network optimization
- Vehicle routing
- Portfolio management

Characteristics of Local Search I
- Not concerned with paths in the tree.
- These algorithms start at a current state and explore only its neighborhood.
- Paths followed by the search are not retained.
- Advantages:
  - They use very little memory.
  - They can find solutions in large or infinite state spaces.

Characteristics of Local Search II
- Search satisfies an objective function:
  - The objective function specifies the constraint on the search.
  - Examples: optimization, fitness function.
- Search space landscapes: they have both a location and an elevation.
  - Location: the current state.
  - Elevation: the cost (to be driven to the global minimum).
- Objectives:
  - To explore the landscape to find a goal (completeness).
  - To find a global minimum or a global maximum (optimality).

One-Dimensional State Space Landscape
[Figure: the objective function plotted over the state space, marking the current state, a shoulder, a flat local maximum, a local maximum, and the global maximum.]
Hill-Climbing Search
- The goal is at the peak of the hill; the search terminates when it reaches the peak.
- Search strategy: move up the hill.
- Does not maintain a search tree; records only the current state and its objective-function value.

Algorithm

8-Queens Example
- Complete-state formulation: the description has all aspects of a solution.
  - Example: all queens are on the board, one per column.
- Successor function: generates all states satisfying a local condition.
  - Example: moving a queen from its location to another square in the same column.
  - 8 columns x 7 positions = 56 successors.
- Heuristic cost function h: the number of pairs of queens that are attacking each other.
  - The global minimum is zero.
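The hill-climbing behavior described above — keep only the current state and its value, move to the best neighbor, and stop at a peak — might be sketched as follows (illustrative; the `successors` and `value` functions are assumed inputs):

```python
def hill_climbing(state, successors, value):
    """Steepest-ascent hill climbing: repeatedly move to the
    highest-valued neighbor; stop when no neighbor improves on the
    current state. Only the current state and its value are kept."""
    while True:
        neighbors = successors(state)
        if not neighbors:
            return state
        best = max(neighbors, key=value)
        if value(best) <= value(state):
            return state  # reached a peak (possibly only a local one)
        state = best
```

Note that the state it returns may be only a local maximum, which is exactly the weakness analyzed on the following slides.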
8-Queens Problem

8-Queens for Hill Climbing
[Figure: 8-queens board annotated with the h value of each possible successor; the state shown has h = 17.]

8-Queens Example: Local Minima
[Figure: a state with h = 1, a local minimum from which every successor has a higher cost.]

Analysis
- Hill climbing is a greedy local search: it does not think ahead.
- It can get stuck at:
  - Local maxima: one of several maxima that is not the global one.
  - Ridges: a sequence of local maxima.
  - Plateaux: flat areas of the objective function.
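The heuristic cost h used in these figures, the number of attacking pairs, can be computed directly from the one-queen-per-column formulation (a sketch; encoding a state as a tuple where `state[c]` is the row of the queen in column c is my assumption):

```python
from itertools import combinations

def attacking_pairs(state):
    """h for 8-queens: state[c] is the row of the queen in column c.
    Counts the pairs of queens that share a row or a diagonal
    (columns cannot clash, since there is one queen per column)."""
    return sum(1 for (c1, r1), (c2, r2) in combinations(enumerate(state), 2)
               if r1 == r2 or abs(r1 - r2) == abs(c1 - c2))
```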
Local Maxima

Genetic Algorithms

Local Beam Search
- Keeps track of k states rather than one state.
- Begins with k randomly generated states.
- At each step, the successors of all k states are generated.
- If any one is a goal, the algorithm halts; otherwise, it selects the k best successors from the complete list and repeats.

Characteristics of Genetic Algorithms (1)
- State space, individuals, and populations:
  - Each state is an individual.
  - The set of states the search strategy keeps at each stage is a population.
- Representing states: a string over a finite alphabet.
- Successor function: combining two current states (parents) produces a successor state.
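The local beam search steps above can be sketched as follows (illustrative; `successors`, `value`, and `is_goal` are assumed inputs, and the step bound is my addition to guarantee termination):

```python
import heapq

def local_beam_search(states, successors, value, is_goal, max_steps=100):
    """Local beam search: keep k states; at each step generate the
    successors of all k states, halt if any is a goal, otherwise
    keep the k best of the complete pool and repeat."""
    k = len(states)
    for _ in range(max_steps):
        pool = [s for st in states for s in successors(st)]
        if not pool:
            return None
        for s in pool:
            if is_goal(s):
                return s
        states = heapq.nlargest(k, pool, key=value)
    return None
```

Unlike running k independent hill climbs, the k survivors are chosen from the pooled successor list, so the beam concentrates on the most promising regions.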
Characteristics of Genetic Algorithms (2)
- Actions:
  - Crossover: part of one state is combined with part of another state.
  - Mutation: randomly change the value of one element of the state's string.
- Objective function:
  - Fitness function: a measure that assigns a number which can be used to compare different states.

Population and Its Changes (8-queens; fitness = number of non-attacking pairs of queens, so a solution has fitness 28)

  (a) Initial population   (b) Fitness (selection %)   (d) Crossover   (e) Mutation
  24748552                 24  (31%)                    32748552        32748152
  32752411                 23  (29%)                    24752411        24752411
  24415124                 20  (26%)                    32752124        32252124
  32543213                 11  (14%)                    24415411        24415417
  (c) Selection: parents are paired in proportion to fitness before crossover.

8-Queens States for Crossover
[Figure: two 8-queens board states combined by crossover to produce a child state.]

Algorithm (1)

function GENETIC-ALGORITHM(population, FITNESS-FN) returns an individual
  inputs: population, a set of individuals
          FITNESS-FN, a function that measures the fitness of an individual
  repeat
    new-population <- empty set
    (continued on the next slide)
Algorithm (2)

    loop for i from 1 to SIZE(population) do
      x <- RANDOM-SELECTION(population, FITNESS-FN)
      y <- RANDOM-SELECTION(population, FITNESS-FN)
      child <- REPRODUCE(x, y)
      if (small random probability) then child <- MUTATE(child)
      add child to new-population
    population <- new-population
  until some individual is fit enough, or enough time has elapsed
  return the best individual in population, according to FITNESS-FN

Algorithm (3)

function REPRODUCE(x, y) returns an individual
  inputs: x, y, parent individuals
  n <- LENGTH(x)
  c <- random number from 1 to n
  return APPEND(SUBSTRING(x, 1, c), SUBSTRING(y, c+1, n))
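The pseudocode above translates fairly directly to Python (an illustrative sketch; fitness-proportional selection via `random.choices`, string slicing for SUBSTRING/APPEND, and the generation cap standing in for "enough time has elapsed" are my concretizations, and the fitness values are assumed nonnegative):

```python
import random

def reproduce(x, y):
    """REPRODUCE: single-point crossover at a random cut point."""
    c = random.randint(1, len(x) - 1)
    return x[:c] + y[c:]

def mutate(child, alphabet):
    """MUTATE: randomly change one element of the string."""
    i = random.randrange(len(child))
    return child[:i] + random.choice(alphabet) + child[i + 1:]

def genetic_algorithm(population, fitness_fn, alphabet, fit_enough,
                      max_generations=1000, mutation_rate=0.1):
    """GENETIC-ALGORITHM: build each new population by selecting
    parents in proportion to fitness, crossing them over, and
    mutating the child with small probability."""
    for _ in range(max_generations):
        weights = [fitness_fn(ind) for ind in population]
        new_population = []
        for _ in range(len(population)):
            x, y = random.choices(population, weights=weights, k=2)
            child = reproduce(x, y)
            if random.random() < mutation_rate:
                child = mutate(child, alphabet)
            new_population.append(child)
        population = new_population
        best = max(population, key=fitness_fn)
        if fitness_fn(best) >= fit_enough:
            return best
    return max(population, key=fitness_fn)
```

For 8-queens, each individual would be an 8-character string over the digits 1-8 and the fitness function the number of non-attacking pairs, as in the population figure above.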