Biology in Computation: Evolving Intelligent Controllers. Combining Genetic Algorithms with Neural Networks: Implementation & Application. Dimitrios N. Terzopoulos (terzopod@math.auth.gr), 18/1/2017
Contents: Intelligent Control vs. Other Strategies; Development of Intelligence in Nature; Artificial Neural Networks & Implementation; Genetic Algorithms & Implementation; Neuroevolution & Simulation
Intelligent Control vs. Other Strategies
System Modelling
Mathematical Model:
  ẋ(t) = A(t) x(t) + B(t) u(t)
  y(t) = C(t) x(t) + D(t) u(t)
Observability, Stability, Controllability
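The model above can be made concrete with a small simulation. The following scalar sketch (illustrative values and forward-Euler integration, not part of the original slides) shows the state and output equations working together:

```python
def simulate_scalar_lti(a, b, c, d, u, x0=0.0, dt=0.01, steps=500):
    """Forward-Euler simulation of the scalar case of the slide's model:
    x'(t) = a x(t) + b u(t),  y(t) = c x(t) + d u(t).
    Illustrative only; a real design would also verify controllability,
    observability, and stability of the system."""
    x = x0
    y = c * x + d * u
    for _ in range(steps):
        x += dt * (a * x + b * u)  # Euler step on the state equation
        y = c * x + d * u          # output equation
    return y

# Stable system x' = -x + u with constant input u = 1: y settles near 1.
y_final = simulate_scalar_lti(a=-1.0, b=1.0, c=1.0, d=0.0, u=1.0)
```

With a = −1 the system is stable, so the output converges to the steady state b·u / (−a) = 1.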
Control: u = ?
Complex Systems: Thermoregulation
Complex Systems: Blood Pressure Regulation
Complex Systems: Economy
Complex System Modelling: System dynamics cannot be modelled easily. Little information about the behavior of the system. Incomplete knowledge of state variables. Too few output variables. Few input variables.
Intelligent Modelling: Complex Systems contain Knowledge. Knowledge enables informal modelling, letting us analyze and understand the behavior of systems. What is knowledge? Whatever carries meaning (in some sense):
Knowledge: knowledge about the terrain of an area. Topographic relief representation. How?
Knowledge can be contained in a Contour Map. Model & Principles.
Knowledge: knowledge about normal cardiac function. Cardiac function representation: Aortic Stenosis and/or Mitral Regurgitation. Electrocardiogram & Phonocardiogram.
Knowledge can be contained in a Phonocardiogram. Model & Principles.
Knowledge: Intelligent Controllers use Knowledge to their advantage. Knowledge is represented internally; sometimes it is not very clear HOW or WHY! Examples: Rules (Fuzzy Logic, Inductive Logic Programming), Structures (Neural Networks), Machine Learning (Support Vector Machines, SVM).
Development of Intelligence in Nature
Biological Controller. Neural Network: Nature's great biological controller.
  Sensors (Vision, Audition, Magnetoception, Somatosensation) → Inputs
  Processor → Hidden Layers
  Decision → Outputs → Action
Natural Intelligence: Learning vs. Evolution (...both, in real life).
Natural Intelligence:
  Learning: the Neural Network is recalibrated through outside interaction.
  Evolution: the Neural Network changes structure across generations.
Artificial Neural Networks
Artificial Neuron: Simplest Model (Weighted Sum Element).
  Soma (characteristics)
  Dendrites
  Axon & Synapses
Artificial Neural Network: Multilayer Feedforward Artificial Neural Network. Acyclic. Organization in Layers. Distinct Input and Output Layers.
Implementation: General Object-Oriented Design. Classes: Constructor, Properties, Functionality (operators, delegates, etc.). Graphical User Interface.
Implementation: Class Neuron
  ReceiveInputs(inputs) (Enter inputs in the input queue)
  ProcessSignalQueue() (Sum activation from signals)
  TryFire(neuron, weight) (Send signals to some neuron)
Class Layer (inherited by InputLayer, OutputLayer)
  PrepareNeurons() (Input Layer only)
  ProcessInputs() (Process neuron queues)
  TryFire(layer) (Send signals to the next Layer's neurons)
  GetOutputs() (Output Layer only)
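A minimal Python sketch of the queue-based design above. The method names mirror the slide's Neuron class; the (value, weight) signal format, the target list, and the method bodies are assumptions:

```python
from collections import deque

class Neuron:
    """Weighted-sum neuron with a signal queue, as sketched on the slide."""
    def __init__(self, threshold=0.0):
        self.signal_queue = deque()  # incoming (value, weight) pairs
        self.threshold = threshold
        self.activation = 0.0
        self.targets = []            # downstream (neuron, weight) pairs

    def receive_inputs(self, inputs):
        """Enter inputs in the input queue."""
        self.signal_queue.extend(inputs)

    def process_signal_queue(self):
        """Sum activation from the queued signals."""
        self.activation = sum(value * weight
                              for value, weight in self.signal_queue)
        self.signal_queue.clear()

    def try_fire(self):
        """Send signals to the target neurons if above threshold."""
        if self.activation > self.threshold:
            for neuron, weight in self.targets:
                neuron.signal_queue.append((self.activation, weight))

# Two input neurons feed one output neuron:
out = Neuron()
n1, n2 = Neuron(), Neuron()
n1.targets = [(out, 0.5)]
n2.targets = [(out, 0.25)]
n1.receive_inputs([(1.0, 1.0)])
n2.receive_inputs([(2.0, 1.0)])
for n in (n1, n2):
    n.process_signal_queue()
    n.try_fire()
out.process_signal_queue()  # activation = 1.0*0.5 + 2.0*0.25 = 1.0
```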
Implementation: Class NeuralNetwork
  ProcessData(inputs) (Pass data through the network)
  RandomNetwork() (Create a network with random values)
Example with inputs x1, x2, hidden output y1, and network output o1:
  y1 = f(Σ_{i=1..2} w_{i1} x_i) = f(w11 x1 + w21 x2)
  o1 = w_{1,o} y1
Activation: f(x) = tanh(x) = (1 − e^{−2x}) / (1 + e^{−2x})
Implementation: Neurons queue the input signals. Each queue is processed when a layer is activated. When activation > threshold, the neuron fires! The fire signal is enqueued in the target neuron's queue. Threshold, bias, activation function: design choices.
Implementation. In this application: Multilayer Feedforward Neural Networks. Many Hidden Layers containing up to 30 Artificial Neurons each. 4 Artificial Neurons for the Input & Output Layers. Hyperbolic Tangent Activation Function. No Threshold for Activation. Random connection weights 0 < w < 1.
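The configuration above can be sketched as a plain-Python feedforward pass. The layer sizes and fixed seed are illustrative, and tanh is applied at every layer for simplicity (the output stage on the earlier slide is a plain weighted sum):

```python
import math
import random

def random_network(layer_sizes, seed=42):
    """Create weight matrices with random weights in (0, 1), one matrix
    per pair of consecutive layers, as on the slide (no bias/threshold)."""
    rng = random.Random(seed)
    return [[[rng.random() for _ in range(n_in)] for _ in range(n_out)]
            for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

def process_data(network, inputs):
    """Pass data through the network: weighted sum + tanh per neuron."""
    signal = list(inputs)
    for matrix in network:
        signal = [math.tanh(sum(w * x for w, x in zip(row, signal)))
                  for row in matrix]
    return signal

# 4 inputs, one hidden layer of 30 neurons, 4 outputs (the slide's sizes):
net = random_network([4, 30, 4])
outputs = process_data(net, [0.1, 0.2, 0.3, 0.4])  # each output in (-1, 1)
```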
Genetic Algorithms
Genetic Algorithms: a form of engineered evolution. They mimic principles of evolution to achieve polymorphism. In short: chromosomes mate to create other chromosomes.
Genetic Algorithms. Encoding: Convert objects to Chromosomes. Each Chromosome represents one solution.
Binary Encoding (a long binary DNA string):
  Solution Parameterization (2 variables): P = {x1, x2}
  Chromosome: C = {a1 ... an}, ai ∈ {0, 1}
  x1 = x1(a1 ... am),  x2 = x2(a(m+1) ... an)
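Decoding a binary chromosome back into the two variables might look as follows; the linear mapping onto a [lo, hi] range and the 4-bit fields are illustrative assumptions, since the slide only fixes the bit partition:

```python
def decode(chromosome, m, lo=0.0, hi=1.0):
    """Split C = {a1 ... an} at position m: the first m bits encode x1,
    the rest encode x2, each mapped linearly onto [lo, hi]."""
    def bits_to_value(bits):
        integer = int("".join(str(b) for b in bits), 2)
        return lo + (hi - lo) * integer / (2 ** len(bits) - 1)
    return bits_to_value(chromosome[:m]), bits_to_value(chromosome[m:])

# 8-bit chromosome, 4 bits per variable:
x1, x2 = decode([1, 1, 1, 1, 0, 0, 0, 0], m=4)  # x1 = 1.0, x2 = 0.0
```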
Value Encoding (the parameterization itself):
  Solution Parameterization (2 variables): P = {x1, x2}
  Chromosome: C = {x1, x2}
Syntax Encoding (any valid syntactic representation, e.g. a syntax tree):
  Solution Parameterization: a Program
  Chromosome: C = the program's syntax tree
Genetic Algorithms: Genetic Operators
  Crossover
  Mutation: {011010100010001100011} → {011010100010101100011} (one bit flipped)
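The two operators can be sketched in a few lines. One-point crossover and an independent per-bit mutation rate are standard textbook choices, not necessarily the exact variants behind the slide:

```python
import random

def crossover(mom, dad, rng):
    """One-point crossover: children swap tails after a random cut."""
    cut = rng.randrange(1, len(mom))
    return mom[:cut] + dad[cut:], dad[:cut] + mom[cut:]

def mutate(chromosome, rate, rng):
    """Flip each bit independently with probability `rate`
    (the slide's example flips exactly one bit)."""
    return [bit ^ 1 if rng.random() < rate else bit for bit in chromosome]

rng = random.Random(1)
mom, dad = [0] * 8, [1] * 8
child1, child2 = crossover(mom, dad, rng)   # complementary 0/1 blocks
mutant = mutate(mom, rate=0.1, rng=rng)     # mostly zeros, a few flips
```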
Genetic Algorithms: A random initial population is created. Each chromosome is assessed at a task (e.g. as a candidate solution). Chromosomes are assigned fitness values.
Genetic Algorithms. One generation cycle consists of: Elimination of some of the least fit. Survival of some of the fittest (Elitism). Some mate to produce offspring (Crossover). Some are randomly altered (Mutation).
Genetic Algorithms: Many ideas can be transferred from Biology to GAs. Artificial Embryogeny (indirect encoding techniques). Sex Chromosomes (sex-linked gene assignment). Etc.
Implementation: Class Chromosome
  Constructor() (Creates Chromosome DNA from an ANN)
  Transcribe() (Converts Chromosome DNA to an ANN)
Class NeuralGeneticRecombiner
  Constructor() (Creates a random ANN population)
  Crossover(mom, dad) (Performs crossover between Chromosomes)
  EvolveGeneration() (Performs the steps to progress a generation)
Implementation. In this application, the Genetic Algorithm uses: Randomization instead of Mutation (mutation proved destructive). Elimination: ~70%. Randomization: 40% of the survivors (≈ 2 of the 12 chromosomes). Population of 12 Chromosomes. Each Chromosome carries 1 RNG seed for random numbers. After crossover, extra weights might be needed.
Implementation. In this application, Evolution proceeds as follows: Chromosomes are sorted in decreasing order of fitness. Elimination takes place (8 chromosomes eliminated, the 4 best stay). Randomization takes place (0.4 × 4 = 1.6 → 2 new created). The remaining 6 chromosomes come from crossover between the 4 fittest.
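The scheme above can be sketched end to end. The counts (12, 4, 2, 6) come from the slides; the fitness function, the randomization helper, and the toy crossover passed in below are placeholders:

```python
import random

def evolve_generation(population, fitness, new_random, crossover, rng):
    """One generation: sort by fitness, keep the 4 fittest of 12
    (elitism), add 2 freshly randomized chromosomes (randomization
    replaced mutation here), and fill the last 6 slots with offspring
    of the 4 fittest."""
    ranked = sorted(population, key=fitness, reverse=True)
    survivors = ranked[:4]                      # the other 8 are eliminated
    randomized = [new_random() for _ in range(2)]
    offspring = [crossover(*rng.sample(survivors, 2)) for _ in range(6)]
    return survivors + randomized + offspring   # 12 chromosomes again

rng = random.Random(0)
pop = [[rng.random() for _ in range(5)] for _ in range(12)]
next_gen = evolve_generation(
    population=pop,
    fitness=sum,                            # toy fitness: sum of genes
    new_random=lambda: [rng.random() for _ in range(5)],
    crossover=lambda m, d: m[:2] + d[2:],   # toy one-point crossover
    rng=rng,
)
```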
Neuroevolution
Neuroevolution: using genetic algorithms to evolve Neural Networks!
Neuroevolution: Population of Generation i → Elimination → Survival (Elitism) → Crossover → Mutation → Population of Generation i+1. (Diagram of bit-string chromosomes recombining.)
Application: Bots in a Maze. Geometric aspects of Motion.
Application: The Maze with the bots.
Application: The Maze is toroidal (bots cross through the edges).
Application: Bots are allowed to explore the Maze. They only feel what's in their Field-of-View. They are controlled by Neural Networks, following the Simulation-Evolution Algorithm.
Application: How do bots react when a wall is sighted? The sighted wall point P is computed from the Left, Center, and Right sight lines:
  P_x = (x_L + 2 x_C + x_R) / 4
  P_y = (y_L + 2 y_C + y_R) / 4
Application: Four inputs x1 ... x4 are created from the sighting: x1 depends on the direction (the angle ω) and x4 on the distance d_p; x2 and x3 are built from a_p and S.
Application: The neural network pilot processes the data (x1, x2, x3, x4) and retrieves the outputs (y1, y2, y3, y4) to guide the bot.
Application: The outputs guide the bot appropriately:
  da_r = 0.5 (y1 / y4) if y1 > y4;  −0.5 (y1 / y4) if y1 < y4
  da_z = 0.1 (y3 / y2) if y3 > y2;  −0.1 (y3 / y2) if y3 < y2
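One way to realize the steering step in code. The gains 0.5 and 0.1 and the output pairings (y1, y4) and (y3, y2) come from the slide; the ratio form is a reconstruction of a garbled formula, so treat it as an assumption:

```python
def steering(y1, y2, y3, y4):
    """Map the four pilot outputs to two turn commands: da_r from the
    (y1, y4) pair with gain 0.5, da_z from the (y3, y2) pair with
    gain 0.1; the sign follows whichever output dominates."""
    dar = 0.5 * (y1 / y4) if y1 > y4 else -0.5 * (y1 / y4)
    daz = 0.1 * (y3 / y2) if y3 > y2 else -0.1 * (y3 / y2)
    return dar, daz

dar, daz = steering(0.8, 0.2, 0.4, 0.4)  # y1 dominates y4; y3 dominates y2
```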
Application: Bots are rewarded with respect to the area covered: they get points for hitting milestones. (Where? How much?)
Application: What does the end result look like?
Application: Bots gather points, they crash (often), and their neural network pilots evolve. Test experiment over a few generations.
(Figure: Evolution of Average Fitness over 8 Generations. Average fitness (0 to 1.8) vs. generation (0 to 8) for Test Runs 1 through 6.)
Conclusions. Any preliminary conclusions?
  Elitism improves accuracy across generations.
  8 generations are too few! Initial outcomes depend on the starting conditions.
  Networks with many layers (>8) appear with time (not always better).
Conclusions (continued):
  The number of inputs and outputs is related to the number of intermediate hidden layers in terms of efficiency.
  Crossover locates the optimal region; Mutation locates the optimal solution within that region.
  Evolution combined with Learning may prove to be even better!
Conclusions. Any general conclusions?
  Fitness function evaluation is extremely important: it guides the evolution process (the fitness landscape). If not carefully designed, the problem is misrepresented.
  Artificial Embryogenesis: a very strong improvement. Compaction of the genotype. Better scan of the solution space. Evolving architectures for ANNs.
Thank you all for your attention!