CS5401 FS2015 Exam 1 Key


This is a closed-book, closed-notes exam. The only items you are allowed to use are writing implements. Mark each sheet of paper you use with your name and the string cs5401fs2015 exam1. If you are caught cheating, you will receive a zero grade for this exam. The max number of points per question is indicated in square brackets after each question. The sum of the max points for all the questions is 60, but note that the max exam score will be capped at 58 (i.e., there are 2 bonus points, but you can't score more than 100%). You have exactly 75 minutes to complete this exam. Keep your answers clear and concise, yet complete. Good luck!

Multiple Choice Questions - write the letter of your choice on your answer paper

Alice and Bob have been working on a wonderful assignment about MAXSAT and have each implemented an EA to solve this challenging problem. Each bit in their genotype indicates whether the corresponding boolean variable is true or false. Let's call Alice's EA "EA-A" and Bob's EA "EA-B". Note that l indicates the length of the bit string representing a solution for this problem. They decide to compare their EAs. Following are their configurations, a results plot, and several associated questions.

Operators/parameters   EA-A                          EA-B
Initialization         Uniform random                Uniform random
Parent selection       k-tournament with k = 0.5µ    Binary tournament
Recombination          Uniform crossover             3-point crossover
Mutation               Bit flip with 0.5 bitwise     Bit flip with the number of bits
                       mutation chance               to flip drawn uniformly at random
                                                     from [0, 0.5l]
Survival type          (µ, λ)-strategy               (µ + λ)-strategy
Survival selection     Fitness proportional          Truncation
Termination            10K evals                     10K evals
µ                      20                            50
λ                      40                            10
Number of runs         30                            30
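For concreteness, the two parent-selection schemes in the table can be sketched as follows. The bit-count fitness and the population setup are illustrative assumptions (a real MAXSAT fitness would count satisfied clauses), not part of the exam.

```python
import random

def k_tournament(population, fitness, k):
    """Pick one parent: draw k individuals uniformly at random
    (with replacement) and return the fittest of the draw."""
    contestants = [random.choice(population) for _ in range(k)]
    return max(contestants, key=fitness)

# Illustrative stand-in fitness: count of 1-bits.
def ones(genotype):
    return sum(genotype)

random.seed(0)
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]

parent_a = k_tournament(population, ones, k=10)  # EA-A: k = 0.5*mu with mu = 20
parent_b = k_tournament(population, ones, k=2)   # EA-B: binary tournament
```

A larger k means higher selective pressure: with k = 10 out of µ = 20, EA-A's tournament winner is very likely among the fittest individuals, while a binary tournament applies much milder pressure.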

The green lines are for EA-A and the red lines are for EA-B. The dotted lines indicate average population fitness, averaged over 30 runs. The solid lines indicate best population fitness, averaged over 30 runs. The box plots for the final populations indicate variation (the upper whisker is the best of the runs, the lower whisker is the worst of the runs, the bottom of the box is the first quartile, and the top of the box is the third quartile).

1. A MAXSAT problem with the neighborhood structure imposed by EA-A is: [2]
(a) unimodal with the possibility of a plateau at the global optimum [true, as it is possible for any subset of bits to flip simultaneously, thus making all global optima neighbors of every other point in the search space, including each other, hence the possibility of a plateau]
(b) unimodal with exactly two global optima [1] (false, because in general we cannot know how many global optima there are for this problem)
(c) multimodal with at least two local optima [1/2] (false, as this problem with the neighborhood structure imposed by EA-A is unimodal)
(d) multimodal with no more than one local optimum [0] (false, both for this problem and in general per the definition of multimodal)

2. Is the genotypic encoding: [2]
(a) pleiotropic but not polygenic [1] (false, as it is not pleiotropic; one bit in the genotype corresponds with the truth value of one paired boolean variable in the phenotype)
(b) polygenic but not pleiotropic [1] (false, as it is not polygenic; one bit in the genotype corresponds with the truth value of one paired boolean variable in the phenotype)
(c) pleiotropic and polygenic [0]
(d) none of the above

3. Is the genotype-phenotype decoding function: [2]
(a) surjective but not injective [1] (false, because it is injective; the specified genotype is identical to the phenotype, which is the direct representation of the trial solutions)
(b) injective but not surjective [1] (false, because it is surjective; the specified genotype is identical to the phenotype, which is the direct representation of the trial solutions)
(c) bijective
(d) none of the above [0]

4. Is the phenotype to fitness mapping: [2]
(a) surjective but not injective [1] (false, because it is not surjective; e.g., the MAXSAT version of an unsatisfiable SAT problem has an unreachable fitness value equal to the number of clauses)
(b) injective but not surjective [1] (false, because it is not injective; i.e., there exist MAXSAT instances where multiple unique solutions satisfy an equal number of clauses)
(c) bijective [0]
(d) none of the above

5. Based on this results plot, Alice proclaims EA-A the better configuration for addressing MAXSAT in the future when the number of evals is immaterial and the use case is finding the best solution over all runs; is she, statistically speaking, correct? [2]
(a) no, because EA-A has a much higher variation at convergence than EA-B, so its higher average best fitness is not statistically relevant [1]
(b) yes, because the first quartile of the average best of EA-A is higher than anything that EA-B found
(c) no, because EA-A is non-elitist and is therefore not guaranteed to preserve the best solution found during a run [1/2]
(d) no, because the average population fitness averaged over all runs of EA-A is lower than that of EA-B, and this measure is more predictive of future performance [1/2]

6. Which of the following configuration changes has a high chance of improving the performance of EA-A: [2]
(a) decrease the bit mutation chance and use a smaller k value for the parent selection k-tournament
(b) change k to 0.75µ [1] (false, because while this does increase selective pressure, which is a bit low, it does not address the principal problem, which is the far too high mutation rate)
(c) decrease λ [0] (false, because λ is already too low for generational survival and this does not address the far too high mutation rate)
(d) change the survival selection to random [0] (false, because the high mutation rate coupled with random survival selection has a high chance of losing the fittest individuals)
(e) change the recombination type to 3-point crossover just like EA-B [0]

7. Which of the following configuration changes has a high chance of improving the performance of EA-B: [2]
(a) change to bitwise mutation with a bitwise mutation chance of 1/l, and change survival selection to a k-tournament without replacement with k = 5
(b) increase the parent selection tournament k value [0] (EA-B is already suffering from premature convergence, so it needs lower selective pressure, not higher)
(c) decrease population size [0] (EA-B is already suffering from premature convergence, so it needs more genetic diversity, not less)
(d) all of the above [1]

8. Why is MAXSAT a more appropriate fitness function for an EA than SAT? [2]
(a) Fitness evaluation for MAXSAT is more efficient [0] (false, because MAXSAT always has to evaluate all clauses, whereas SAT does not always have to do so)
(b) All of MAXSAT's local optima will always be global optima [1] (false, local optima may not be fully satisfiable)
(c) Unlike SAT, MAXSAT provides a fitness landscape with a gradient, making an informed search more effective
(d) Because I can't get no SATisfaction [0] (false, as this is a lame joke with no bearing on the question)
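Question 8's point about MAXSAT providing a gradient can be illustrated with a minimal fitness function; the clause encoding below (signed integer literals) is an assumption for illustration, not part of the exam.

```python
def maxsat_fitness(assignment, clauses):
    """Count satisfied clauses. Each clause is a list of integer
    literals: +i means variable i must be true, -i means variable i
    must be false (variables are 1-indexed into `assignment`)."""
    def satisfied(clause):
        return any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
    return sum(satisfied(clause) for clause in clauses)

# Tiny illustrative instance (not from the exam):
# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
clauses = [[1, -2], [2, 3], [-1, -3]]

full = maxsat_fitness([True, True, False], clauses)     # 3: all clauses satisfied
partial = maxsat_fitness([False, True, True], clauses)  # 2: partial credit
```

A SAT fitness function would return only satisfied/unsatisfied (1 or 0), leaving almost the entire landscape flat; MAXSAT's clause count gives the EA a gradient to climb.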

9. EA-B runs for the following number of generations: [2]
(a) 199 [1]
(b) 200 (this rounds up 199.8 to complete the final generation) [1/2]
(c) 500 [0]
(d) (10000 - 50)/10 = 9950/10 = 995
(e) 1000 [1]
(f) 10000 [0]

The following are some general multiple choice questions:

10. A Hamming cliff is: [2]
(a) a cliff-like plot associated with a binary-encoded fitness function belonging to an NP-Complete problem class [0]
(b) a pair of binary strings which differ in many of their bits (i.e., have a large Hamming distance) exhibits a Hamming cliff
(c) a population of individuals with binary representation whose genotypes are very similar and who all have a large Hamming distance to the global optimum, causing premature convergence [1/2]
(d) all of the above [0]

11. Binary gray code is: [2]
(a) a binary coding system where the Hamming distance between consecutive corresponding integers is proportional to their magnitude [1]
(b) a binary coding system specifically developed to represent gray scales in digital images [1/2]
(c) a binary coding system with consecutive corresponding integers differing by no more than 2 bits [1 1/2]
(d) all of the above [0]
(e) none of the above

12. One advantage of implementing survivor selection by employing a so-called reverse k-tournament selection to select who dies is that: [2]
(a) you guarantee 1-elitism (i.e., the fittest individual is guaranteed to survive) [1/2]
(b) you guarantee k-elitism [1]
(c) you guarantee (k-1)-elitism
(d) the probability of surviving is proportional to your fitness rank [0]

13. In Evolution Strategies with uncorrelated mutation with n step sizes, the conceptual motivation for updating the mutation step sizes with the formula σ'_i = σ_i · e^(τ'·N(0,1) + τ·N_i(0,1)) is: [2]
(a) the sum of two normally distributed variables is also normally distributed [1/2]
(b) the common base mutation e^(τ'·N(0,1)) allows for an overall change of the mutability, guaranteeing the preservation of all degrees of freedom [1]
(c) the coordinate-specific e^(τ·N_i(0,1)) provides the flexibility to use different mutation strategies in different directions [1]
(d) all of the above
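The step-size update from question 13 can be sketched as follows. The learning rates τ' = 1/sqrt(2n) and τ = 1/sqrt(2·sqrt(n)) are the commonly recommended settings; the floor eps0 on the step sizes is a standard safeguard, both assumptions here rather than exam content.

```python
import math
import random

def mutate_es(x, sigmas, eps0=1e-6):
    """Uncorrelated mutation with n step sizes:
        sigma_i' = sigma_i * exp(tau' * N(0,1) + tau * N_i(0,1))
        x_i'     = x_i + sigma_i' * N_i(0,1)
    N(0,1) is drawn once per individual (common base mutation);
    N_i(0,1) is drawn fresh per coordinate (coordinate-specific)."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2.0 * n)         # overall learning rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))    # coordinate-wise learning rate
    common = tau_prime * random.gauss(0.0, 1.0)  # shared across all coordinates
    new_sigmas = [max(s * math.exp(common + tau * random.gauss(0.0, 1.0)), eps0)
                  for s in sigmas]
    new_x = [xi + si * random.gauss(0.0, 1.0)
             for xi, si in zip(x, new_sigmas)]
    return new_x, new_sigmas
```

The single shared draw changes all step sizes in concert (overall mutability), while the per-coordinate draws let each dimension adapt its own step size.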

14. Rechenberg's 1/5 success rule: [2]
(a) refers to the minimum successful mutation rate threshold necessary for an Evolution Strategy to reach the global optimum [1]
(b) refers to the ratio of offspring created by mutation versus recombination in Genetic Programming [0]
(c) refers to a rule of thumb for the optimal ratio of successful versus total mutations in Evolution Strategies, where the mutation step size is increased if the ratio is greater than 1/5 and decreased if the ratio is smaller than 1/5
(d) refers to the minimum ratio of successful offspring creations versus total offspring creations required for a parent to survive to the next generation [1/2]

15. Parameter control is important in EAs because: [2]
(a) optimal strategy parameter values may change during evolution [1]
(b) it may somewhat relieve users from parameter tuning, as parameter control may make an EA less sensitive to initial strategy parameter values [1]
(c) all of the above
(d) none of the above [0]

16. Blind Parameter Control is a better name for the class of parameter control mechanisms named Deterministic Parameter Control in the textbook because that class: [2]
(a) does not use any feedback from the evolutionary process [1]
(b) includes stochastic mechanisms [1]
(c) all of the above
(d) none of the above [0]

17. If we employ self-adaptation to control the value of penalty coefficients for an EA with an evaluation function which includes a penalty function, then: [2]
(a) this cannot be done, because it is inherently impossible to self-adapt any part of the evaluation function [0]
(b) the penalty coefficients will be self-adapted, but the increase in fitness achieved may not be correlated with better performance on the objective function
(c) the penalty coefficients will be self-adapted to cause fitness improvement, just like, for instance, mutation step sizes [1/2]
(d) all of the above [1/2]

The following are some regular questions:

18. (a) What is the binary gray code for the standard binary number 11110111? [2]
        10001100
    (b) What is the standard binary number encoded by the binary gray code 111111010? [2]
        101010011
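The gray code conversions in question 18 can be checked programmatically; this is a minimal sketch using the standard reflected-binary rules g = b XOR (b >> 1) and b_i = b_(i-1) XOR g_i.

```python
def binary_to_gray(b):
    """Standard binary string -> gray code string: g = b XOR (b >> 1)."""
    n = int(b, 2)
    return format(n ^ (n >> 1), '0{}b'.format(len(b)))

def gray_to_binary(g):
    """Gray code string -> standard binary string:
    b_1 = g_1, then b_i = b_(i-1) XOR g_i for the remaining bits."""
    bits = [int(g[0])]
    for c in g[1:]:
        bits.append(bits[-1] ^ int(c))
    return ''.join(str(bit) for bit in bits)

assert binary_to_gray('11110111') == '10001100'    # part (a)
assert gray_to_binary('111111010') == '101010011'  # part (b)
```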

19. Given the following two parents with permutation representation:
p1 = (9 1 2 4 5 6 3 8 7)
p2 = (7 8 5 9 2 3 1 6 4)

(a) Compute the first offspring with Cycle Crossover. [4]
Cycle 1: 9-7-4, Cycle 2: 1-8-6-3, Cycle 3: 2-5
Construct the first offspring by taking the cycles alternately from the parents, starting with cycle 1 from parent 1:
i.   Add cycle 1 from parent 1: 9 _ _ 4 _ _ _ _ 7
ii.  Add cycle 2 from parent 2: 9 8 _ 4 _ 3 1 6 7
iii. Add cycle 3 from parent 1: 9 8 2 4 5 3 1 6 7

(b) Compute the first offspring with PMX, using crossover points between the 2nd and 3rd loci and between the 6th and 7th loci. [5]
i.   Copy the segment from parent 1: _ _ 2 4 5 6 _ _ _
ii.  Place parent 2's displaced segment elements (9 and 3) using the PMX mapping: _ _ 2 4 5 6 _ 3 9
iii. Copy the remaining elements from parent 2: 7 8 2 4 5 6 1 3 9

(c) Compute the first offspring with Edge Crossover, except that for each random choice you instead select the lowest element. [10]

Original Edge Table (a "+" marks an edge common to both parents):
1: 9,2,3,6
2: 1,4,9,3
3: 6,8,2,1
4: 2,5,6,7
5: 4,6,8,9
6: 5,3,1,4
7: 8+,9,4
8: 3,7+,5
9: 7,1,5,2

Construction Table:
Element selected | Reason                                       | Partial result
1                | Lowest                                       | 1
2                | Equal list size, so lowest                   | 1 2
3                | 3 and 9 tie for shortest list, so lowest     | 1 2 3
6                | Equal list size, so lowest                   | 1 2 3 6
4                | Shortest list                                | 1 2 3 6 4
5                | Equal list size, so lowest                   | 1 2 3 6 4 5
8                | Equal list size, so lowest                   | 1 2 3 6 4 5 8
7                | Only element                                 | 1 2 3 6 4 5 8 7
9                | Last element                                 | 1 2 3 6 4 5 8 7 9

Edge Table After Step 1:
2: 4,9,3
3: 6,8,2
4: 2,5,6,7
5: 4,6,8,9
6: 5,3,4
7: 8+,9,4
8: 3,7+,5
9: 7,5,2

Edge Table After Step 2:
3: 6,8
4: 5,6,7
5: 4,6,8,9
6: 5,3,4
7: 8+,9,4
8: 3,7+,5
9: 7,5

Edge Table After Step 3:
4: 5,6,7
5: 4,6,8,9
6: 5,4
7: 8+,9,4
8: 7+,5
9: 7,5

Edge Table After Step 4:
4: 5,7
5: 4,8,9
7: 8+,9,4
8: 7+,5
9: 7,5

Edge Table After Step 5:
5: 8,9
7: 8+,9
8: 7+,5
9: 7,5

Edge Table After Step 6:
7: 8+,9
8: 7+
9: 7

Edge Table After Step 7:
7: 9
9: 7

(d) Compute the first offspring with Order Crossover, using crossover points between the 3rd and 4th loci and between the 7th and 8th loci. [3]
i.  Copy the segment from parent 1: _ _ _ 4 5 6 3 _ _
ii. Fill the remaining loci with the unused elements in the order in which they appear in parent 2, starting after the second crossover point: 9 2 1 4 5 6 3 7 8
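The Cycle Crossover result in part (a) can be verified with a short implementation; this is a sketch (the position bookkeeping is one of several equivalent ways to write it), not the only correct formulation.

```python
def cycle_crossover(p1, p2):
    """First offspring of Cycle Crossover: identify the cycles of
    positions, then copy the cycles alternately from p1 and p2,
    starting with the first cycle from p1."""
    n = len(p1)
    child = [None] * n
    taken = [False] * n
    from_p1 = True
    for start in range(n):
        if taken[start]:
            continue
        pos = start
        while not taken[pos]:          # trace one cycle of positions
            taken[pos] = True
            child[pos] = p1[pos] if from_p1 else p2[pos]
            pos = p1.index(p2[pos])    # follow the cycle to the next position
        from_p1 = not from_p1          # alternate the source parent per cycle
    return child

p1 = [9, 1, 2, 4, 5, 6, 3, 8, 7]
p2 = [7, 8, 5, 9, 2, 3, 1, 6, 4]
assert cycle_crossover(p1, p2) == [9, 8, 2, 4, 5, 3, 1, 6, 7]  # part (a)
```

Swapping the roles of p1 and p2 in the call yields the second offspring, since the cycles are the same for both parents.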