Machine Evolution


As you will see later in this course, neural networks can learn, that is, adapt to given constraints. For example, NNs can approximate a given function. In biology, such learning corresponds to learning by an individual organism. However, in nature there is a different type of adaptation, which is achieved by evolution. Can we use evolutionary mechanisms to create learning programs?

Fortunately, on our computers we can simulate evolutionary processes faster than in real time. We simulate the two main aspects of evolution: generation of descendants that are similar to but slightly different from their parents, and selective survival of the fittest descendants, i.e., those that perform best at a given task. Iterating this procedure leads to individuals that are better and better at the given task.

Let us say that we wrote a computer vision algorithm that has two free parameters x and y. We want the program to learn the optimal values for these parameters, that is, those values that allow the program to recognize objects with maximum probability p. To visualize this, we can imagine a 3D landscape defined by p as a function of x and y. Our goal is to find the highest peak in this landscape, which is the maximum of p.

We can solve this problem with an evolutionary approach. Any variant of the program is completely defined by its values of x and y and can thus be located somewhere in the landscape. We start with a random population of programs. Those individuals at higher elevations, which perform better, get a higher chance of reproduction than those in the valleys. Reproduction can proceed in two different ways: production of descendants near the most successful individuals ("single parents"), or production of new individuals by pairs of successful parents, where the descendants are placed somewhere between the parents.

The fitness (or performance) of a program is then a function of its parameters x and y. The initial random population of programs is scattered across this fitness landscape. Only the most successful programs survive and generate children that are similar to themselves, i.e., close to them in parameter space. Again, only the best ones survive and generate offspring, and so on, until the population approaches maximum fitness.
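This parameter-evolution loop can be sketched in a few lines of Python. The fitness function `p(x, y)` below is a made-up stand-in for the recognition probability of the hypothetical vision algorithm (a single smooth peak), and the population sizes and mutation width are illustrative choices, not values from the lecture:

```python
import random

def p(x, y):
    # Stand-in fitness landscape: one smooth peak at (3, -2), height 1.
    return 1.0 / (1.0 + (x - 3.0) ** 2 + (y + 2.0) ** 2)

def evolve(pop_size=50, survivors=10, generations=100, sigma=0.3):
    # Start with a random population of (x, y) parameter pairs.
    population = [(random.uniform(-10, 10), random.uniform(-10, 10))
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selective survival: only the fittest individuals reproduce.
        population.sort(key=lambda ind: p(*ind), reverse=True)
        parents = population[:survivors]
        children = []
        while len(children) < pop_size - survivors:
            if random.random() < 0.5:
                # Single parent: descendant placed near the parent.
                x, y = random.choice(parents)
                children.append((x + random.gauss(0, sigma),
                                 y + random.gauss(0, sigma)))
            else:
                # Two parents: descendant placed between them.
                (x1, y1), (x2, y2) = random.sample(parents, 2)
                t = random.random()
                children.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
        population = parents + children
    return max(population, key=lambda ind: p(*ind))

best = evolve()
```

Because the surviving parents are carried over unchanged, the best fitness in the population can never decrease from one generation to the next.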

Instead of just varying a number of parameters, we can evolve complete programs (genetic programming). Let us evolve a wall-following robot in a grid-space world. The robot's behavior is determined by a LISP function. We use four primitive functions:

AND(x, y) = 0 if x = 0; else y
OR(x, y) = 1 if x = 1; else y
NOT(x) = 0 if x = 1; else 1
IF(x, y, z) = y if x = 1; else z

The robot receives sensory inputs n, ne, e, se, s, sw, w, and nw. These inputs are 0 whenever the corresponding cell is free, otherwise they are 1. The robot can move either north, east, south, or west. In genetic programming, we must make sure that all syntactically possible expressions in a program are actually defined and do not crash our system.

We start with a population of 5000 randomly created programs and let them perform. We let each robot start ten times, each time in a different position. Each time, we let the robot perform 60 steps and count the number of different cells adjacent to a wall that the robot visits. There are 32 such cells, so our fitness measure ranges from 0 (lowest fitness) to 320 (highest fitness).

After finding the best-performing program among the 5000 randomly generated ones, we breed generation i + 1: 500 individuals are directly copied from generation i, and 4500 are created by crossover operations between two parents chosen from the 500 winners. In about 50 cases, mutation is additionally performed.
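The point about making all expressions total (never crashing) is easy to see when the primitives are written out. Here is a sketch in Python (the lecture uses LISP, so this syntax is only illustrative), including a small evaluator for tree-shaped programs; the nested-tuple program and the sensor dictionary are assumed representations, not the lecture's exact ones:

```python
# The four primitive functions, using 0/1 truth values as in the lecture.
# Each is defined for every input, so no evolved expression can crash.
def AND(x, y): return 0 if x == 0 else y
def OR(x, y):  return 1 if x == 1 else y
def NOT(x):    return 0 if x == 1 else 1
def IF(x, y, z): return y if x == 1 else z

PRIMITIVES = {'AND': AND, 'OR': OR, 'NOT': NOT, 'IF': IF}

def evaluate(expr, sensors):
    """Evaluate an S-expression; leaves are sensor names or move directions."""
    if isinstance(expr, str):
        # A leaf: either a sensor reading or a literal like 'east'.
        return sensors.get(expr, expr)
    op, *args = expr
    return PRIMITIVES[op](*[evaluate(a, sensors) for a in args])

# An evolved program as a nested tuple: "if there is a wall to the
# north or northeast, move east, otherwise move north".
program = ('IF', ('OR', 'n', 'ne'), 'east', 'north')

# Sensor inputs: 0 = adjacent cell free, 1 = occupied by a wall.
sensors = {'n': 1, 'ne': 0, 'e': 0, 'se': 0,
           's': 0, 'sw': 0, 'w': 0, 'nw': 0}

move = evaluate(program, sensors)  # 'east' for these sensor readings
```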

In a crossover operation, a subtree of one parent program is exchanged with a subtree of the other parent. Mutation is performed by replacing a subtree of a program with a randomly created subtree. After six generations, the best program already behaves like a wall-follower, and after ten generations we have a perfect program (fitness 320). A diagram of the maximum fitness as a function of the generation number shows this rapid improvement.

You could simulate an evolutionary process to improve your Isola playing algorithm. The easiest way to do this would be to use evolutionary learning to find the optimal weight vector in your static evaluation function, i.e., the optimal weighting for each evaluation feature that you compute. For example, assume that you are using the features f1 (number of neighboring squares) and f2 (number of reachable squares). In each case, you actually use the difference between the value for yourself and the value for your opponent. Then you could use weights w1 and w2 to compute your evaluation function:

e(p) = w1 f1 + w2 f2

So the performance of your algorithm depends on the weights w1 and w2. This corresponds to the example of the computer vision algorithm with two free parameters, so you could use an evolutionary process to find the best values for w1 and w2 just as in that example.
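Subtree crossover and mutation on tree-shaped programs can be sketched as follows. The nested-list representation and the way random subtrees are picked are illustrative assumptions, not the lecture's exact implementation:

```python
import copy
import random

def subtrees(tree, path=()):
    """Yield (path, subtree) for every node of a nested-list program tree."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):  # skip the operator
            yield from subtrees(child, path + (i,))

def replace_at(tree, path, new):
    """Return a copy of tree with the subtree at path replaced by new."""
    if not path:
        return copy.deepcopy(new)
    tree = copy.deepcopy(tree)
    node = tree
    for i in path[:-1]:
        node = node[i]
    node[path[-1]] = copy.deepcopy(new)
    return tree

def crossover(a, b):
    """Swap a random subtree of a with a random subtree of b."""
    pa, sa = random.choice(list(subtrees(a)))
    pb, sb = random.choice(list(subtrees(b)))
    return replace_at(a, pa, sb), replace_at(b, pb, sa)

def mutate(tree, random_subtree):
    """Replace a random subtree with a freshly generated one."""
    path, _ = random.choice(list(subtrees(tree)))
    return replace_at(tree, path, random_subtree())

parent1 = ['IF', 'n', 'east', 'north']
parent2 = ['OR', ['NOT', 'w'], 's']
child1, child2 = crossover(parent1, parent2)
```

Because a whole subtree (possibly a single leaf, possibly the entire program) is swapped, every child is again a syntactically valid program.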

But how can you determine which individuals survive and procreate? One possibility would be to hold a tournament in which all individuals compete (or many smaller tournaments), and only the best n individuals reach the next generation, i.e., the next tournament. The other individuals are deleted and replaced with new individuals that use weights similar to those of the winners. This way you will evolve algorithms of better and better performance; in other words, you will approach the best values for w1 and w2.

You could slightly modify the game code to implement this principle of evolution. When you have obtained the best values for w1 and w2 (or in your case maybe w1, w2, …, w37), just transfer these values into your original program. Your program should now play significantly better than it did prior to its evolutionary improvement. Try it out!
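The tournament scheme above can be sketched like this. The `play_game` function is a placeholder for your actual Isola game code (here it is faked with a toy criterion so the sketch runs standalone), and the population size, survivor count, and mutation width are illustrative assumptions:

```python
import random

N_WEIGHTS = 2  # e.g. w1, w2 -- or 37 for a richer evaluation function

def play_game(weights_a, weights_b):
    """Placeholder for the real game: return 1 if player A wins, else 0.
    Faked here so the sketch is self-contained."""
    return 1 if sum(weights_a) > sum(weights_b) else 0

def tournament_scores(population):
    # Round-robin tournament: every individual plays every other.
    scores = [0] * len(population)
    for i, a in enumerate(population):
        for j, b in enumerate(population):
            if i != j:
                scores[i] += play_game(a, b)
    return scores

def next_generation(population, n_survivors=5, sigma=0.1):
    scores = tournament_scores(population)
    ranked = sorted(zip(scores, population),
                    key=lambda pair: pair[0], reverse=True)
    winners = [ind for _, ind in ranked[:n_survivors]]
    # Replace the losers with slightly perturbed copies of the winners.
    children = [[w + random.gauss(0, sigma) for w in random.choice(winners)]
                for _ in range(len(population) - n_survivors)]
    return winners + children

population = [[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
              for _ in range(20)]
for _ in range(10):
    population = next_generation(population)
```

A full round-robin costs O(n²) games per generation; many small random pairings, as suggested in the text, are a cheaper alternative when games are slow.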