OptCE: A Counterexample-Guided Inductive Optimization Solver


Federal University of Amazonas (UFAM), Postgraduate Program in Electrical Engineering (PPGEE). OptCE: A Counterexample-Guided Inductive Optimization Solver. Higo Albuquerque, Rodrigo Araújo, Iury Bessa, Lucas Cordeiro, and Eddie Lima. Mikhail Ramalho, mikhail.ramalho-gadelha@soton.ac.uk

Agenda: Counterexample-Guided Inductive Optimization; Illustrative Example; CEGIO Algorithms; OptCE Tool; Experimental Evaluation

General Steps of Counterexample-Guided Inductive Optimization
Modeling: the optimization problem is defined by a cost function, and then its constraints are introduced.
Specification: describes the behavior of the system and the properties to be verified; a C program is generated with ESBMC functions that restrict the state space.
Verification: verifies the C program and reports whether a global optimum has been found.

Inductive Optimization Based on Counterexamples: Given a cost function f : X → ℝ, where X ⊂ ℝⁿ is the space of decision variables, and Ω ⊂ X is the set of constraints, a multivariate optimization problem consists of finding the vector of optimal decision variables x* that minimizes f subject to Ω. The problem is non-convex if and only if f(x) is a non-convex function.

Inductive Optimization Based on Counterexamples: Himmelblau's function presents four global minima: f(x1, x2) = (x1² + x2 − 11)² + (x1 + x2² − 7)²

Inductive Optimization Based on Counterexamples: To extend the verifier to solve optimization problems, two code directives are used: ASSUME and ASSERT. ASSUME defines the constraints over the non-deterministic variables, from which the verifier restricts the state space. ASSERT defines the property to be verified, returning true or false for the optimization check.

Inductive Optimization Based on Counterexamples: The decision variables of the problem are defined as non-deterministic integers. An integer variable p = 10ⁿ controls the precision and the discretization of the state space, where n is the number of decimal places of the decision variables.

Inductive Optimization Based on Counterexamples: Successive verification runs are executed, iteratively increasing the precision, to converge to the optimal solution. Each run checks whether there exists a point x with f(x) < f_c, where f_c is the current candidate for the minimum; a counterexample provides a point with a lower objective value.

Illustrative Example: Minimize the Himmelblau function, subject to x1 ∈ [−5, 0] and x2 ∈ [0, 5].

Illustrative Example:

int nondet_int();
int main() {
  int p = 1;                       /* precision variable */
  float fc = 100;                  /* current minimum candidate */
  int X1 = nondet_int();           /* non-deterministic decision variables */
  int X2 = nondet_int();
  float x1, x2, fobj;
  ESBMC_assume((X1 >= -5*p) && (X1 <= 0*p));   /* constraints */
  ESBMC_assume((X2 >= 0*p) && (X2 <= 5*p));
  x1 = (float) X1 / p;
  x2 = (float) X2 / p;
  fobj = (x1*x1 + x2 - 11)*(x1*x1 + x2 - 11)
       + (x1 + x2*x2 - 7)*(x1 + x2*x2 - 7);    /* Himmelblau's function */
  ESBMC_assume(fobj < fc);
  ESBMC_assert(fobj > fc, "");
  return 0;
}

Illustrative Example: the precision variable starts as p = 10⁰ = 1.

Illustrative Example: the decision variables are declared as non-deterministic integers.

Illustrative Example: the ASSUME statements specify the constraints and reduce the state space.


CEGIO Algorithms: CEGIO-G applies to general functions; CEGIO-S applies to positive semi-definite functions; CEGIO-F applies to convex functions.

CEGIO-G Algorithm


CEGIO-G Algorithm: updates the constraints based on the counterexample.

CEGIO-G Algorithm: if not, the precision variable is updated.

CEGIO-G Algorithm: repeat until the desired precision is reached.

Deviating from local minima: We evaluated the performance of our methodology for minimizing the Himmelblau function, and compared it with gradient descent (GD) and genetic algorithm (GA) methods. Unlike GD and GA, the proposed methodology does not get trapped in local minima and is able to find the global minimum.

Deviating from local minima: Mathematical techniques and heuristics depend on initialization and cannot guarantee global optimality; they can easily get stuck in a local minimum. SMT-based optimization, in contrast, reaches the global minimum.

Functions with prior knowledge: There are functions about which we have some a priori knowledge, for example, positive semi-definite functions; distance and energy functions belong to this class. This knowledge makes it possible to modify the previous algorithm to improve the convergence time.

CEGIO-S Algorithm


Convex functions: Another special class is that of convex functions, i.e., functions satisfying f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y) for all λ ∈ [0, 1].

Convex functions: [plot of a convex function y = f(x)]

Convex functions: A local minimum of a convex function f over a convex set is always a global minimum of f. The set of constraints can therefore be updated from the solution obtained before each precision increase: the new bounds are the predecessor and successor values of the solution found so far.

CEGIO-F Algorithm

CEGIO-F Algorithm: constraint-set update.

OptCE: A Counterexample-Guided Inductive Optimization Solver. The OptCE tool implements the CEGIO algorithms, performs counterexample-based optimization with various combinations of verifiers and solvers, and establishes a new approach to optimizing functions.

OptCE: Architecture
The input file and parameters are given on the command line.
A C file with the specification is generated.
The C code is verified with the chosen verifier and solver.
If the verification fails, the counterexample provides a new candidate for the minimum; this value is used to specify a new C file and as the starting point of the next iteration.
The cycle repeats until the check is SUCCESSFUL, which means the global minimum has been found with the current precision.
The precision is then incremented and checked against the desired precision limit: if the limit is exceeded (FALSE), the desired global minimum has been found; otherwise (TRUE), the precision is updated at runtime to generate a new specification.

OptCE: Input File
Format adopted for the constraint matrices. Example input file for the Adjiman function:
Fobj = cos(x1)*sin(x2) - (x1/(x2*x2+1));
# A = [-1 2; -1 1];
Mathematical functions have been rewritten to simplify the verification process. The user can write a math function and insert it into the OptCE math library.

OptCE Features
BMC configuration: CBMC or ESBMC
Solver configuration: Boolector, Z3, MathSAT, MiniSAT
Algorithm configuration: CEGIO-G, CEGIO-S, CEGIO-F
Initialization: sets the optimization start point
Insert library: inserts personal libraries with math functions
Timeout: configures the time limit, in seconds
Precision: sets the desired precision (number of decimal places of a solution)

Optimizing via OptCE
Call: ./optCE name.func
Set properties: [--esbmc | --cbmc] [--boolector | --z3 | --mathsat | --minisat] [--generalized | --positive | --convex] [--start-value=?] [--library=name] [--timeout=?]

Experimental Evaluation. Objectives: evaluate the performance of the proposed algorithms; check the performance of the SAT and SMT solvers for optimizing the functions; compare the methodology with traditional techniques such as genetic algorithms, particle swarm, pattern search, simulated annealing, and nonlinear programming.

Experimental Evaluation. Configuration of experiments: a set of 10 functions commonly used for testing optimization algorithms, with different characteristics: differentiable or non-differentiable, separable or non-separable, unimodal or multimodal, etc.

Experimental Evaluation. Configuration of experiments: the CEGIO-G algorithm {--generalized} was employed on all functions; the CEGIO-S algorithm {--positive} was applied to the functions Booth, Himmelblau, and Leon; the CEGIO-F algorithm {--convex} was used for the functions Zettl, Rotated Ellipse, and Sum Square.

Experimental Evaluation. Experimental results, CEGIO-G {--generalized}: considering the proposed combinations, the optimization time varies significantly: ESBMC + MathSAT is 2.8 times faster than CBMC + MiniSAT, while ESBMC + Z3 presents the highest execution time.

Experimental Evaluation. Experimental results, CEGIO-S {--positive}: the benchmarks executed with the --positive flag had their time reduced considerably, since no checks are made in the negative domain, which reduces the search space.

Experimental Evaluation. Experimental results, CEGIO-F {--convex}: the tests with benchmarks 8, 9, and 10 using the --convex flag showed a significant reduction in optimization time, because the search space shrinks at each verification step.

Experimental Evaluation. Experimental results, CEGIO algorithms vs. traditional techniques: the great difference between OptCE and the other techniques is the success rate; while the other techniques get stuck in local minima, OptCE finds the global minimum.

Conclusion: The OptCE tool formalizes a new optimization approach, based on the counterexample analysis of software verifiers, and implements the CEGIO algorithms. The comparisons show how the approach evolved across the CEGIO algorithms, offering better and more specific solutions for convex and non-negative functions. The tool's success rate is also higher than that of the other analysis techniques.

Future work: Incorporate checks using other SAT solvers besides MiniSAT. Adapt the tool to run on multiple cores, reducing the optimization time. Improve the input file format.