Introduction to Scientific Modeling CS 365, Fall Semester, 2007 Genetic Algorithms

Introduction to Scientific Modeling CS 365, Fall Semester, 2007 Genetic Algorithms Stephanie Forrest FEC 355E http://cs.unm.edu/~forrest/cas-class-06.html forrest@cs.unm.edu 505-277-7104

Genetic Algorithms. Principles of natural selection applied to computation: variation, selection, inheritance. Evolution in a computer: individuals (genotypes) stored in the computer's memory; evaluation of individuals (artificial selection); differential reproduction through copying and deletion; variation introduced by analogy with mutation and crossover. This simple algorithm captures much of the richness seen in naturally evolving populations.

A Simple Genetic Algorithm (diagram): the population at generation T_n (strings such as 00111, 11100, 01010, ...) is transformed into the population at T_n+1 (e.g., 01100, 11100, 11010, ...) by selection, crossover, and mutation. Example fitness values: F(00111) = 0.1, F(11100) = 0.9, F(01010) = 0.5.
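To make the loop in the diagram concrete, here is a minimal Python sketch of a generational GA with roulette-wheel selection, 1-point crossover, and bit-flip mutation, using the rough parameter guidelines given later in these notes. The one-max fitness function (counting 1-bits) is an assumed toy example, not the fitness used on the slide.

import random

def simple_ga(fitness, length=5, pop_size=20, generations=50,
              p_crossover=0.6, p_mutation=0.005):
    """Toy generational GA: fitness-proportionate selection,
    1-point crossover, per-bit mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        total = sum(scores) or 1.0
        # Roulette-wheel (fitness-proportionate) selection of a parent.
        def select():
            r, acc = random.uniform(0, total), 0.0
            for ind, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return ind
            return pop[-1]
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = select(), select()
            if random.random() < p_crossover:        # 1-point crossover
                cut = random.randint(1, length - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):                   # bit-flip mutation
                for i in range(length):
                    if random.random() < p_mutation:
                        child[i] ^= 1
                next_pop.append(child)
        pop = next_pop[:pop_size]
    return max(pop, key=fitness)

# Example: maximize the number of 1s in a 5-bit string (an assumed toy fitness).
print(simple_ga(fitness=sum))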

Example Performance Curve: plot of mean fitness and max fitness (Fitness, 0 to 100) against Generation (0 to ~535).

Where did these ideas originate? Genetic algorithms (Holland, 1962): the original idea was to create an algorithm that captured the richness of natural adaptive systems; emphasized the adaptive properties of entire populations and the importance of recombination mechanisms such as crossover; application to function optimization introduced by De Jong (1975). Evolutionsstrategie (Rechenberg, 1965): emphasized the importance of selection and mutation as mechanisms for solving difficult real-valued optimization problems. Evolutionary programming (Fogel et al., 1966): emphasis on evolving finite state machines. Genetic programming (Koza, 1992). ==> Evolutionary Computation

References: J. H. Holland, Adaptation in Natural and Artificial Systems, Univ. of Michigan Press (1975); second edition, MIT Press (1992). D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley (1989). M. Mitchell, An Introduction to Genetic Algorithms, MIT Press (1996). S. Forrest, "Genetic Algorithms: Principles of natural selection applied to computation," Science 261:872-878 (1993). J. Koza, Genetic Programming, MIT Press (1992).

Multi-parameter Function Optimization: F(x, y) = yx^2 - x^4. The bit string 001101 splits into x = 001 and y = 101 (base 2), i.e., x = 1 and y = 5 (base 10), so F(001101) = F(1, 5) = 5*1^2 - 1^4 = 4. Encoding table: decimals 0 through 7 map to binary 000, 001, 010, 011, 100, 101, 110, 111.
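A small sketch of this decoding step, assuming the first three bits encode x and the last three encode y; the function names are illustrative.

def decode(bits):
    """Split a 6-bit string into x (first 3 bits) and y (last 3 bits),
    each read as an unsigned base-2 integer."""
    x = int(bits[:3], 2)
    y = int(bits[3:], 2)
    return x, y

def F(x, y):
    return y * x**2 - x**4

x, y = decode("001101")     # x = 1, y = 5
print(F(x, y))              # 5*1 - 1 = 4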

Multi-parameter Function Optimization (Gray coding): F(x, y) = yx^2 - x^4. The Gray-coded bit string 001111 is first degrayed to the base-2 string 001101, giving x = 1 and y = 5 in base 10, so F(001111) = F(1, 5) = 5*1^2 - 1^4 = 4. Gray code table:
Decimal  Binary  Gray code
0        000     000
1        001     001
2        010     011
3        011     010
4        100     110
5        101     111
6        110     101
7        111     100
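The degray step can be sketched as a standard reflected-Gray-to-binary conversion; the helper name is an assumption.

def gray_to_binary(gray_bits):
    """Convert a reflected Gray-coded bit string to an ordinary binary string."""
    binary = gray_bits[0]
    for g in gray_bits[1:]:
        # Each binary bit is the previous binary bit XOR the next Gray bit.
        binary += str(int(binary[-1]) ^ int(g))
    return binary

print(gray_to_binary("001"))   # -> "001" (x = 1)
print(gray_to_binary("111"))   # -> "101" (y = 5)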

Example Applications of Genetic Algorithms Engineering applications: Multi-parameter function optimization (e.g., spectroscopic applications, turbine engine design). Sequencing problems (e.g., circuit design, factory scheduling, TSP). Machine and robot learning. Complex data analysis. Automatic programming (e.g., genetic programming). Modeling: Rule discovery in cognitive systems. Learning strategies for games. Affinity maturation in immune systems. Ecosystem modeling.

Implementation Issues. Permutation problems and special operators (example: TSP). Genetic programming (example: trigonometric functions). Modeling applications (examples: Prisoner's Dilemma, Classifier Systems, Echo).

Implementation Issues. Data structures: packed arrays of bits; byte arrays; vectors of real numbers; lists and trees. Representation: feature lists; binary encodings, Gray codes; real numbers; permutations; trees. Selection: on next slide. Scaling: on next slide. Crossover: 1-point, 2-point, n-point; uniform; special operators. Mutation: bit flips; creep (Gaussian noise). Elitism. Parameters (rough guidelines): bitstring length (32-10,000), population size (100-1,000), length of run (50-10,000), crossover rate (0.6 per pair), mutation rate (0.005 per bit).

Selection Methods. Fitness-proportionate (used in theoretical studies): the expected number of offspring of individual i is ExpVal(i) = f(i) / f_avg, where f_avg is the mean fitness of the population; implemented as a roulette wheel. Rank-based (SKIP): intended to prevent premature convergence (slows down evolution); each individual is ranked according to fitness, and the expected value depends only on rank: ExpVal(i) = Min + (Max - Min) * (rank(i) - 1) / (N - 1), where Min and Max are constants.
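The two expected-value formulas can be sketched as follows; the Min = 0.5 and Max = 1.5 values in the example are assumed illustrations, not taken from the slide.

def expected_value_proportionate(f_i, fitnesses):
    """Fitness-proportionate expected number of offspring: f_i / mean fitness."""
    return f_i * len(fitnesses) / sum(fitnesses)

def expected_value_rank(rank_i, n, min_ev=0.5, max_ev=1.5):
    """Linear ranking: worst rank (1) gets Min, best rank (N) gets Max."""
    return min_ev + (max_ev - min_ev) * (rank_i - 1) / (n - 1)

fits = [0.1, 0.9, 0.5]
print([round(expected_value_proportionate(f, fits), 2) for f in fits])  # [0.2, 1.8, 1.0]
print([round(expected_value_rank(r, 3), 2) for r in (1, 2, 3)])         # [0.5, 1.0, 1.5]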

Selection Methods cont. Tournament: Computationally efficient. Choose T = size of tournament (2 is a common value). Pick subsets of size T from the population randomly (with replacement). Compare fitnesses within the subset and choose the winner (either deterministically or stochastically). Iterate. Steady state. Implicit (e.g., Echo, immune models, etc.) Reproduction rate proportional to resources obtained.
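A minimal sketch of tournament selection as described above; choosing the winner deterministically here, with the stochastic variant noted in the comment.

import random

def tournament_select(population, fitness, t=2):
    """Pick t individuals at random (with replacement) and return the fittest.
    A stochastic variant lets the best contestant win only with some probability."""
    contestants = [random.choice(population) for _ in range(t)]
    return max(contestants, key=fitness)

pop = ["00111", "11100", "01010"]
print(tournament_select(pop, fitness=lambda s: s.count("1")))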

Scaling Methods: how do we maintain a steady rate of evolution? (SKIP) Linear: based on max, min, and mean; f' = a*f + b, requiring f'_avg = f_avg; described in Goldberg, 1989. Sigma: based on the mean and standard deviation; f' = f - (f_avg - c*sigma), with cutoffs for extreme values; c is typically 1.5 or 2.0. Exponential: f' = k^f, which normalizes the difference between fitnesses 1.0 and 1.5 to count the same as between 1000.0 and 1000.5.
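A short sketch of the three scaling rules as reconstructed above; the constants (c = 2.0, k = 1.1) and the zero cutoff are assumed example choices.

import statistics

def linear_scale(fits, a, b):
    """Linear scaling f' = a*f + b; callers choose a and b (from max, min, mean)
    so that f'_avg = f_avg."""
    return [a * f + b for f in fits]

def sigma_scale(fits, c=2.0):
    """Sigma scaling f' = f - (mean - c*sigma); negative values cut off at zero."""
    mean, sigma = statistics.mean(fits), statistics.pstdev(fits)
    return [max(0.0, f - (mean - c * sigma)) for f in fits]

def exponential_scale(fits, k=1.1):
    """Exponential scaling f' = k**f: only fitness differences matter, so
    1.0 vs 1.5 and 1000.0 vs 1000.5 give the same selection ratio."""
    return [k ** f for f in fits]

print(sigma_scale([0.1, 0.9, 0.5]))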

Genetic Programming Evolve populations of computer programs: Typically use the language Lisp. Select a set of primitive functions for each problem. Represent program as a syntax tree. Function approximation vs. function optimization. Crossover operator: Exchange subtrees between individual program trees. Schema Theorem? Many applications: Optimal control (e.g., the pole balancing problem) Circuit design Symbolic regression (data fitting)

Genetic Programming. The expression x^2 + 3xy + y^2 is written in Lisp as (+ (* x x) (* 3 x y) (* y y)) and represented as a syntax tree with + at the root and three * subtrees whose leaves are (x x), (3 x y), and (y y).

Genetic Programming cont. Consider evolving a program to compute cos 2x. A human-designed program: 1 - 2 sin^2 x, in Lisp: (- 1 (* 2 (* (sin x) (sin x)))). A genetic programming solution: (sin (- (- 2 (* x 2)) (sin (sin (sin (sin (sin (sin (* (sin (sin 1)) (sin (sin 1))))))))))). Junk DNA?

How do Genetic Algorithms Work? The Central Dogma of Genetic Algorithms (Holland, 1975) Schema processing Schema Theorem Implicit parallelism Building block hypothesis K-armed bandit analogy

Genetic Algorithms and Search High-dimensional search spaces: All binary strings of length l. All possible strategies for playing the game of chess. All possible tours in the Travelling Salesman Problem. Genetic algorithms use biased sampling to search high-dimensional spaces: Independent sampling. Selection biases search towards high-fitness regions. Crossover combines partial solutions from different strings. Partial solution formalized as schema. *1** 1*** *0** 0***

Schemas. Schemas capture important regularities in the search space. Example: the string 1 0 0 1 1 1 0 1 0 0 1 1 is an instance of (among many others) the schemas * * * * 1 1 * * 0 * * * and * * 0 * * 1 * * 0 * 1 1. Implicit parallelism: one individual samples many schemas simultaneously. Schema Theorem: reproduction and crossover guarantee exponentially increasing samples of the observed best schemas. Order of a schema, O(s) = number of defined bits. Defining length of a schema, D(s) = distance between the outermost defined bits.
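Order, defining length, and schema membership are easy to state in code; a small sketch (positions are 0-indexed here, and the example schema is the first one above).

def order(schema):
    """O(s): number of defined (non-*) positions."""
    return sum(1 for c in schema if c != "*")

def defining_length(schema):
    """D(s): distance between the outermost defined positions."""
    defined = [i for i, c in enumerate(schema) if c != "*"]
    return defined[-1] - defined[0] if defined else 0

def samples(string, schema):
    """True if the string is an instance of the schema."""
    return all(c == "*" or c == b for b, c in zip(string, schema))

s = "****11**0***"
print(order(s), defining_length(s))     # 3 4
print(samples("100111010011", s))       # True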

Schema Theorem (Holland, 1975). Let s be a schema in the population at time t, and N(s,t) the number of instances of s at time t. Question: what is the expected N(s,t+1)? Assume fitness-proportionate selection, so the expected number of offspring of a string x is F(x)/F_avg(t). Ignoring crossover and mutation: N(s,t+1) = (u(s,t)/F_avg(t)) * N(s,t), where u(s,t) is the observed mean fitness of the instances of s at time t. Note: if u(s,t)/F_avg(t) = c, a constant, then N(s,t) = c^t * N(s,0). Crossover and mutation are handled as loss terms: N(s,t+1) >= (u(s,t)/F_avg(t)) * N(s,t) * (1 - p_c * D(s)/(l - 1)) * (1 - p_m)^O(s), where p_c and p_m are the crossover and mutation rates and l is the string length.

Royal Road Schemas: plot of the fitness of schema 1, schema 2, and schema 3 (Fitness, 0 to 100) against Generation (0 to 300).

Study Question. Given a population of N individuals, each L bits long, how many schemas are sampled by the population (in one generation)? Hint: the minimum value is 2^L and the maximum value is N * 2^L.

Building Blocks Example. Interpret the bit string as a binary decimal in the fitness range 0.0 to 1.0, e.g., genome .1000 = 0.5 and .01000 = 0.25. Schema 1#### corresponds to the upper half of the interval and 0#### to the lower half (each covering a region of width 0.5); schema #1### picks out the even intervals and #0### the odd intervals.

Questions. When will genetic algorithms work well, and when will they not? They are not appropriate for problems where it is important to find the exact global optimum. GA domains are typically those about which we have little analytical knowledge (complex, noisy, dynamic, poorly specified, etc.). We would like a mathematical characterization that is predictive. What makes a problem hard for genetic algorithms? Deception, multi-modality, conflicting schema information, noise, ... What makes a problem easy for genetic algorithms? What distinguishes genetic algorithms from other optimization methods, such as hill climbing? What does it mean for a genetic algorithm to perform well?

Building Blocks Hypothesis (Holland, 1975) 1. GA initially detects biases in low-order schemas: GA obtains good estimates of schema average fitness by sampling strings. 2. Over time, information from low-order schemas is combined through crossover, and 3. GA detects biases in high-order schemas, 4. Eventually converging on the most fit region of the space. Implies that crossover is central to the success of the GA. Claim: GA allocates samples to schemas in a near optimal way: K-armed bandit argument.

The Two-armed Bandit. Originated in statistical decision theory and adaptive control. Suppose a gambler is given N coins with which to play a slot machine with two arms (A1 and A2). The arms have mean payoff (per trial) rates m1 and m2 and variances s1^2 and s2^2. The payoff processes from the two arms are each stationary and independent of one another. The gambler does not know these payoffs and can estimate them only by playing coins on the different arms.

Two-armed Bandit cont. Given that the player wants to maximize total payoff, what strategy is best? On-line payoff (maximize payoff during the N trials) vs. off-line payoff (determine which arm has the higher payoff rate). Claim: the optimal strategy is to exponentially increase the sampling rate of the observed best arm as more samples are collected. Apply to schema sampling in the GA: the 3^L schemas in an L-bit search space can be viewed as the 3^L arms of a multi-armed slot machine, and the observed payoff of a schema H is simply its observed fitness. Claim: the GA is a near-optimal strategy for sampling schemas; it maximizes on-line performance.

Most GA theory is not practical or predictive, except in a general way. Analysis methods: Walsh polynomials and schema analysis Population genetics models Markov chains Signal-to-noise analysis Statistical structure of fitness landscapes Statistical mechanics models PAC learning Common traps: Infinite populations. Analysis intractable except for short strings (16 or less). Enumerate all possible populations. Convergence proofs based on idea that any string can mutate into any other string. Weak bounds.

Permutation Problems and Special Operators. Problem: find an optimal ordering for a sequence of N items. Examples: Traveling Salesman Problem (TSP), bin packing problems, scheduling, DNA fragment assembly. (Slide shows a small TSP instance: a graph of cities with edge weights.)

Using Genetic Algorithms to Find Good Tours for TSP. Natural representation: permutations in which each city is labeled by a unique integer, e.g., the tours 3 2 1 4 and 4 1 2 3. Problem: mutation and crossover do not produce legal tours; e.g., crossing over these parents can yield 3 2 2 3, which visits cities 2 and 3 twice. Solutions: other representations; other operators; penalize illegal solutions through the fitness function.

Specialized Operators. What information (schemas) should be preserved? Absolute position in the sequence; relative ordering in the sequence (precedence); adjacency relations. How much randomness is introduced? Order crossover (Davis, 1985). Partially-mapped crossover (PMX) (Goldberg et al., 1985). Cycle crossover (Oliver et al., 1987). Edge recombination: try to preserve adjacencies in the parents, favoring adjacencies common to both parents; when both of these fail, make a random selection.

Edge Recombination Operator Algorithm: 1. Select one parent at random and assign the first element in its permutation to be the first one in the child. 2. Select the second element for the child, as follows: If there is an adjacency common to both parents, then choose that. If there is an unused adjacency available from one parent, choose it. If (1) and (2) fail, then pick an adjacency randomly. 3. Select the remaining elements in order by repeating step 2.

Edge Recombination Example. Original individuals: 3 6 2 1 4 5 and 5 2 1 3 6 4. New individual: 3 6 4 1 2 5. Edge table (key: adjacent keys, pooled from both parents): 1: 2, 2, 3, 4; 2: 1, 1, 5, 6; 3: 1, 6, 6; 4: 1, 5, 6; 5: 2, 4; 6: 2, 3, 3, 4.
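A sketch of edge recombination along the lines of the algorithm and example above, treating tours as paths (as the edge table suggests). Ties among unused adjacencies are broken at random here, so the slide's child 3 6 4 1 2 5 is one possible output, not the only one.

import random

def edge_recombination(parent1, parent2):
    """Edge-recombination crossover for permutations.
    Prefers adjacencies shared by both parents, then any unused adjacency,
    then a random remaining element."""
    def adjacency(tour):
        adj = {}
        for a, b in zip(tour, tour[1:]):
            adj.setdefault(a, []).append(b)
            adj.setdefault(b, []).append(a)
        return adj
    adj1, adj2 = adjacency(parent1), adjacency(parent2)
    edges = {k: adj1.get(k, []) + adj2.get(k, []) for k in parent1}
    child = [random.choice((parent1, parent2))[0]]   # first element of a random parent
    while len(child) < len(parent1):
        current = child[-1]
        candidates = [e for e in edges[current] if e not in child]
        common = [e for e in candidates if edges[current].count(e) > 1]
        if common:                       # adjacency present in both parents
            nxt = common[0]
        elif candidates:                 # unused adjacency from one parent
            nxt = random.choice(candidates)
        else:                            # fall back to a random unused element
            nxt = random.choice([e for e in parent1 if e not in child])
        child.append(nxt)
    return child

print(edge_recombination([3, 6, 2, 1, 4, 5], [5, 2, 1, 3, 6, 4]))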

Using Genetic Algorithms to Model Natural and Artificial Systems. Modeling social systems: evolution as a model of imitation (Prisoner's Dilemma); economies (stock market). Cognitive systems: induction and learning (Classifier Systems). Genetic evolution. Natural ecological systems (Echo). Immune systems. Somatic evolution in CancerSim. Ecosystem modeling. Artificial life models. In each of these systems, adaptation is central. What can we learn from modeling with genetic algorithms?

Using Complex Systems Approaches to Social Modeling. Societies and political systems: Sugarscape models (related to cellular automata); George Gumerman's agent-based simulation of Anasazi settlements. Economies: stock market models; econophysics approaches to understanding financial markets. Game theory / non-zero-sum games. Ant colony models (social insects).

Evolution of Cooperation (Axelrod, 1984). What is the Prisoner's Dilemma? Payoff matrix (Player A's payoff, Player B's payoff): both cooperate: 3, 3; A cooperates, B defects: 0, 5; A defects, B cooperates: 5, 0; both defect: 1, 1. Non-zero-sum game: the total number of points is not constant across configurations. What is the best move for Player A? For Player B?

Prisoner's Dilemma cont. Iterated Prisoner's Dilemma: play the game for an indeterminate number of steps. This raises the possibility of strategies based on the behavior of the other player. What are the possible strategies? RANDOM; always cooperate; always defect; Tit-for-Tat; model the other player and try to maximize against that model; etc.

Evolution of Cooperation ~1980 Two tournaments: Each strategy (entry) encoded as a computer program. Round robin (everyone plays everyone). Tit-For-Tat (TFT) won both times. 1982-85 Learning algorithms: Can TFT evolve in a competitive environment? How could a TFT strategy evolve in nature? Is there a better strategy? How do strategies develop? Use genetic algorithm to study these questions, In environment of tournament. In a changing environment (co-evolution). In a 3-person game. In an n-person game. In a Norms game.

Prisoner's Dilemma Using a Genetic Algorithm. The population consists of individual strategies, 1 strategy per chromosome. Each chromosome is a 64-bit (with the extra bits below, 70-bit) string. Each bit specifies the strategy's next move (cooperate or defect) for one particular history of 3 moves. Encoding: we need to remember each player's moves for 3 time steps ==> 6 pieces of information. At each point, either player could cooperate or defect (a binary decision), so there are 2^6 = 64 possible histories. The value of the bit at a given position tells the strategy what to do (0 = cooperate, 1 = defect) in the context of that history. An additional 6 bits encodes the assumption about previous interactions before the start of the game.

Prisoner's Dilemma Encoding cont. Let A be the sum formed by counting the other player's defection as 2 and my own defection as 1, and weighting the three remembered rounds, e.g., 16 for the most recent move, 4 for the move two time steps in the past, and 1 for the move three time steps in the past. A history of mutual cooperation for 3 time steps ==> 0. A history of mutual defection for 3 time steps ==> 63. What history should we assume for the first 3 moves? Mutual cooperation, or genetically determined (needs 6 more bits).
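The history-to-index mapping follows directly from the weights above; a small sketch, with the move ordering (most recent first) and function names as assumptions, and the extra 6 bits for the assumed initial history omitted.

def history_index(my_moves, other_moves):
    """Map the last three rounds to an index 0..63.
    Moves are 0 = cooperate, 1 = defect; index 0 of each list is the most
    recent round. The other's defection counts 2, my defection counts 1,
    and the rounds are weighted 16, 4, 1 (most recent first)."""
    weights = (16, 4, 1)
    return sum(w * (2 * o + m)
               for w, m, o in zip(weights, my_moves, other_moves))

def next_move(chromosome, my_moves, other_moves):
    """Look up the next move (0 = cooperate, 1 = defect) in the 64-bit table."""
    return chromosome[history_index(my_moves, other_moves)]

print(history_index([0, 0, 0], [0, 0, 0]))   # mutual cooperation -> 0
print(history_index([1, 1, 1], [1, 1, 1]))   # mutual defection  -> 63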

Prisoner's Dilemma Using GA: Fitness Function. Environment of the tournament: select 8 representative strategies (whose outcomes predict overall performance) from the ~160 tournament entries, and play each candidate strategy against those 8 representatives. Co-evolutionary environment: strategies play against the other strategies in the population.

Results for the Fixed Environment. From a random start, the GA evolved populations whose median member was just as successful as TFT. Most of the evolved strategies resemble TFT. In some runs, strategies were discovered that did significantly better than TFT (in the tournament environment): such a strategy discriminates between different opponents (based only on their behavior), adjusts its own behavior to exploit an exploitable opponent, and does this without getting into (too) much trouble with other opponents. It achieved this by defecting on the first move and then apologizing when necessary.

Results in Evolving Environment (From Axelrod, 1997)

Cognitive Modeling Using Classifier Systems (Induction, 1985). Focus on problems with the following characteristics: perpetually novel streams of data (noise, fluctuations, drift); continuous real-time requirements for actions (complete rationality is infeasible); implicitly defined goals, such as survival, making money, winning a game; intermittent payoffs that give no direct information about individual actions. Claim: many control problems have this form (including cognitive systems). A classifier system is a learning system that creates new categories based on observed regularities in the environment, continuously updates its model of the environment, and can allocate credit among its components.

What is a Classifier System? Abstract parallel architecture: Forward-chaining rule system. Ecology of program instructions. Learning algorithms: Credit assignment (bucket brigade). Rule discovery (genetic algorithm). Theory of model construction: Homomorphic view of modeling. Q-morphisms.

Classifier System Architecture (diagram): the rule set posts messages to and reads messages from a message list, which also exchanges messages with the environment; credit assignment is performed by the bucket brigade and rule discovery by a genetic algorithm. Rules are of the form condition / action, e.g.: ##1# / 0000 (don't care = #, defined values = 0, 1); 1111, ~0000 / 1010 (multiple conditions, negative conditions); ###1 / ###0 (pass-through).

Bucket Brigade (How to assign credit to individual rules) Each classifier C has a strength S(C,t) : Overall usefulness of rule. Basis of competition between rules. Bids B(C,t) : Based on past usefulness (strength). Relevance to current situation (specificity) R(C) : B(C,t) = b x R(C) x S(C,t) Each rule acts like a middle man in an economy: Producers and consumers of messages. Direct payoff to last classifier in chain.

Bucket Brigade cont. Winning classifiers place their message(s) on the message list and pay their bid out to their suppliers: for the winning classifier C, S(C, t+1) = S(C, t) - B(C, t); for each supplier C', S(C', t+1) = S(C', t) + a x B(C, t), where a = 1 / (number of classifiers sharing the payoff). Probability matching vs. optimizing. Variants: partial matching, taxation, exponentials and coefficients, etc.
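A sketch of one bucket-brigade payment step under the reading above (the winner pays its bid, shared equally among its suppliers); the function name, dictionary layout, and example numbers are illustrative assumptions.

def bucket_brigade_step(strengths, winner, suppliers, bid_coeff, specificity):
    """One credit-assignment step: the winning classifier pays its bid,
    and the bid is shared equally among the suppliers of the messages it matched."""
    bid = bid_coeff * specificity[winner] * strengths[winner]   # B(C,t) = b * R(C) * S(C,t)
    strengths[winner] -= bid
    if suppliers:
        share = bid / len(suppliers)                            # a = 1 / number of suppliers
        for s in suppliers:
            strengths[s] += share
    return strengths

S = {"rule1": 10.0, "rule2": 8.0}
R = {"rule1": 0.75, "rule2": 0.5}
print(bucket_brigade_step(S, winner="rule1", suppliers=["rule2"],
                          bid_coeff=0.1, specificity=R))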

Ecological Modeling with Echo. Echo is an abstract model of ecological behavior: a large number of interacting agents; interactions among agents (trade, combat, mating); agents distributed over a geography of sites (a 2-d array); each site has renewable resources (letters); agents can migrate between sites; evolution. Echo adds geography, resources, and interactions to GAs. Echo adds evolution to resource-based ecological models. Echo lacks metabolism.

An Echo Agent (diagram): the agent carries a reservoir of resource letters (a b d a c c d b) and a genome made up of tags and conditions (abbac, ccd, aab, bba, c, cddbb), a trading resource (a), and an uptake mask (1 0 1 1); in the example the full genome is abbac ccd aab bba c cddbb a 1011.

Agent Interactions (diagram): Agent 1 has mating tag aa and mating condition cb; Agent 2 has mating tag cb and mating condition aa. Agent 1 is attracted to agents with a mating tag of cb, and Agent 2 is attracted to agents with a mating tag of aa.

The Echo Cycle. Pairs of agents are selected to interact: combat, trade, mating. Agents pick up resources from the environment. Sites charge maintenance. Agents die probabilistically. Sites produce resources. Agents who have not acquired any resources migrate. Agents replicate asexually, if able.

The Ecology of Echo. Stability of interactions among variants (e.g., fly-ant-caterpillar). What happens when more resources are added to the system? Relative species abundance (trajectories, duration, distributions). Cataclysmic events (e.g., meteors). Species formation. Isolation effects. Flows of resources. Transition from single-cell to multi-cellular organization.

What can we hope to learn from models like Echo? Patterns of behavior: Flow of resources Cooperation among agents Arms races Communities Critical regions of parameter space. Interactions between local situations and global effects. Effects of exogenous and endogenous changes. Build intuitions about important dependencies and interactions. Identify commonalities among naturally occurring complex adaptive systems.

Sorted Order Representation. Bit string: 0 1 0 1 1 1 0 0 1 1 0 1 0 1 1 0 1 0. The first five 3-bit groups are the key values 2, 7, 1, 5, 3 at positions 1-5; the final group (0 1 0 = 2) selects the starting position. Sorting the keys and reading off their positions gives the permutation 3 1 5 4 2; rotating it to start at position 2 gives the layout 1 5 4 2 3.
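A sketch of one way to decode this representation, assuming five 3-bit keys followed by a 3-bit start position, with ties between equal keys broken by position; the function name is illustrative.

def decode_sorted_order(bits, key_bits=3, n_keys=5):
    """Decode a sorted-order chromosome: the first n_keys groups are keys,
    the final group selects the starting (rotation) position."""
    groups = [int(bits[i:i + key_bits], 2)
              for i in range(0, len(bits), key_bits)]
    keys, start = groups[:n_keys], groups[n_keys]
    # Permutation = positions (1-based) of the keys in ascending sorted order.
    order = sorted(range(n_keys), key=lambda i: (keys[i], i))
    perm = [i + 1 for i in order]
    # Rotate so the layout starts at the element in the given position.
    return perm[start - 1:] + perm[:start - 1]

print(decode_sorted_order("010111001101011010"))   # -> [1, 5, 4, 2, 3]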