Genetic algorithms for solving a class of constrained nonlinear integer programs


Intl. Trans. in Op. Res. 8 (2001) 61–74

Genetic algorithms for solving a class of constrained nonlinear integer programs

Ruhul Sarker, Thomas Runarsson and Charles Newton
School of Computer Science, University of New South Wales, ADFA, Northcott Drive, Canberra, ACT 2600, Australia
ruhul@cs.adfa.edu.au

Received 7 July 1999; received in revised form 6 April 2000; accepted 4 May 2000

Abstract

We consider a class of constrained nonlinear integer programs, which arise in manufacturing batch-sizing problems with multiple raw materials. In this paper, we investigate the use of genetic algorithms (GAs) for solving these models. Both binary and real coded genetic algorithms with six different penalty functions are developed. The real coded genetic algorithm works well for all six penalty functions compared to binary coding. A new method to calculate the penalty coefficient is also discussed. Numerical examples are provided and computational experiences are discussed.

Keywords: nonlinear, integer, penalty function, genetic algorithms, batch sizing

1. Introduction

Consider a constrained nonlinear programming problem.

Problem P1:
Minimize f(x)
Subject to g(x) ≤ 0
h(x) = 0
x ∈ X

where g is a vector function with components g_1, ..., g_m and h is a vector function with components h_1, ..., h_k. Here f, g_1, ..., g_m, h_1, ..., h_k are functions on E^n and X is a nonempty set in E^n. The set X represents simple constraints that can be handled explicitly, such as lower and upper bounds on the variables.

There are several conventional methods for solving nonlinear programming problems, such as penalty function, barrier function, reduced gradient, and other methods. In the penalty and barrier

© 2001 International Federation of Operational Research Societies. Published by Blackwell Publishers Ltd

function methods, the unconstrained subproblem becomes extremely ill-conditioned for extreme values of the penalty/barrier parameters. Reduced gradient methods have difficulty following the boundary of highly nonlinear constraints. These methods rely on local gradient information; the optimum found is the best in the neighborhood of the current point. To enhance the reliability and robustness of the search technique in constrained optimization, a GA-based search method incorporating several penalty functions is tested in this paper.

We consider a manufacturing batch-sizing problem with multiple raw materials. A single raw material, unconstrained manufacturing batch-sizing model was developed by Sarker and Khan (1997; 1999). This model was later modified by Sarker and Newton (2000) to incorporate a transportation module for finished-product delivery. In this paper, we reformulate the problem of Sarker and Khan (1997; 1999) to accommodate multiple raw materials and impose constraints that make the situation more realistic. The new formulation belongs to a class of constrained nonlinear integer programming problems. These are known as hard problems in the operational research literature, and there is no simple, easy-to-implement solution technique for them. Although there are several heuristic and line-search based methods for solving unconstrained joint batch-sizing problems, no specialized algorithm has appeared for the constrained problems. The constrained problems can be solved using the theory of nonlinear programming, though such methods are not easy to implement. Our first attempt was to solve a simple representative problem using the commercial optimization package LINDO/LINGO. The package provides suboptimal solutions for some known problems (Sarker and Newton, 1999).
This encouraged us to explore the use of other methods, for example evolutionary algorithms (EAs). In this research, our purpose is to explore the use of GAs for solving this class of constrained nonlinear integer programming problems. We chose a GA as a heuristic because it is well known for its success and ease of implementation. We also applied evolution strategies (ES) to the same problem, but those results are not reported in this paper because of their poor performance (Runarsson and Sarker, 1999). Earlier, we used simulated annealing (SA) for unconstrained problems (Sarker and Yao, forthcoming). Like the conventional optimization approach, the GAs use the penalty-function method to convert the constrained problems into unconstrained problems. A number of methods available in the literature have not yet been investigated to find the appropriate penalty function and the relevant parameters for the class of models considered in this paper. Six penalty-function based GAs are investigated to determine their suitability for solving these problems. We also discuss a new method, currently under investigation, for calculating an appropriate penalty coefficient. We use both binary coded and real coded genetic algorithms with different crossovers and mutations. Binary coding is usually regarded as the most suitable encoding for any problem because it maximizes the number of schemata being searched implicitly (Holland, 1975; Goldberg, 1989), but there are many examples in the evolutionary computation literature where alternative representations have produced algorithms with greater efficiency and optimization effectiveness on identical problems (see, for example, Back and Schwefel (1993) and Fogel and Stayton (1994)). Davis (1991) and Michalewicz (1996) comment that for many applications real values or other representations may be chosen to advantage over binary coding.
There does not appear to be any general benefit in maximizing implicit parallelism in evolutionary algorithms, and, therefore, forcing problems to fit into a binary representation may not be recommended. In our case, the real coded genetic algorithm works better than binary coding. The detailed computational experiences are presented.

The organization of the paper is as follows: following this introduction, the paper gives a brief introduction to penalty function methods in Section 2, and to genetic algorithms in Section 3. In Section 4, the batch-sizing problem is described, and the mathematical formulation of the problem is presented in Section 5. In Section 6, the different transformation methods are provided. The model-solving approach using GAs is presented in Section 7. The computational experiences are reported in Section 8. Finally, conclusions are provided in Section 9.

2. Penalty function method

The penalty function method converts the problem into an equivalent unconstrained problem and then solves it using a suitable search algorithm. Two basic types of penalty functions exist: exterior penalty functions, which penalize infeasible solutions, and interior penalty functions, which penalize feasible solutions. We discuss only exterior penalty functions in this paper, as the implementation of interior penalty functions is considerably more complex for multiple-constraint cases (Smith and Coit, 1997). Three degrees of exterior penalty functions exist: (i) barrier methods, in which no infeasible solution is considered; (ii) partial penalty functions, in which a penalty is applied near the feasibility boundary; and (iii) global penalty functions, which are applied throughout the infeasible region (Schwefel, 1995).

In general, a penalty function approach places the constraints into the objective function via a penalty parameter in such a way that any violation of the constraints is penalized. A general form of the unconstrained problem is as follows:

Problem P2:
Minimize f_p(x) = f(x) + μα(x)
Subject to x ∈ X

where μ >
0 is a large number, and α(x) is the penalty function. We define problem P2 as the penalty problem and the objective function of P2 as the penalized objective function f_p(x), which is the simple sum of the unpenalized objective function and a penalty (for a minimization problem). A suitable penalty function incurs a positive penalty at infeasible points and no penalty at feasible points. The solution to the penalty problem can be made arbitrarily close to the optimal solution of the original problem by choosing μ sufficiently large. However, if we choose a very large μ and attempt to solve the penalty problem, we may run into computational difficulties of ill-conditioning (Bazaraa and Shetty, 1979). With a large μ, more emphasis is placed on feasibility, and most procedures for unconstrained optimization will move quickly toward a feasible point. Even though this point may be far from optimal, premature termination could occur. As a result of these difficulties associated with large penalty parameters, most algorithms use penalty functions that employ a sequence of increasing penalty parameters. With each new value of the penalty parameter, an optimization

technique is employed, starting from the optimal solution corresponding to the previously chosen parameter value. It is very difficult to choose an appropriate penalty coefficient (Michalewicz, 1995). A lower value of the penalty coefficient means a higher number of iterations is required to solve the penalty problem. Several methods for selecting a penalty coefficient are discussed later.

3. Introduction to GA

During the last two decades there has been a growing interest in algorithms based on a principle of evolution: survival of the fittest. Such techniques are now commonly referred to as evolutionary computation (EC) methods. The best known algorithms in this class include genetic algorithms, evolutionary programming, evolution strategies, and genetic programming. There are also many hybrid systems which incorporate various features of the above paradigms and are consequently hard to classify; they are generally referred to as EC methods (Khouja et al., 1998). The methods of EC are stochastic algorithms whose search methods model natural phenomena: genetic inheritance and Darwinian strife for survival. GAs follow a step-by-step procedure that mimics the process of natural evolution, following the principles of natural selection and `survival of the fittest'. In these algorithms a population of individuals (potential solutions) undergoes a sequence of unary (mutation-type) and higher-order (crossover-type) transformations. These individuals strive for survival. A selection scheme, biased towards fitter individuals, selects the next generation. This new generation contains a higher proportion of the characteristics possessed by the `good' members of the previous generation; in this way good characteristics are spread over the population and mixed with other good characteristics.
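This generational cycle can be sketched in Python; a minimal sketch, where `fitness`, `init`, `crossover` and `mutate` are hypothetical problem-specific callables (not taken from the paper):

```python
import random

def genetic_algorithm(fitness, init, crossover, mutate,
                      pop_size=50, generations=200):
    """Minimal generational GA for a minimization problem:
    binary tournament selection, then crossover and mutation."""
    population = [init() for _ in range(pop_size)]
    for _ in range(generations):
        # Binary tournament: the fitter of two random individuals survives
        selected = []
        for _ in range(pop_size):
            a, b = random.sample(population, 2)
            selected.append(a if fitness(a) <= fitness(b) else b)
        # Variation: offspring from crossover of selected parents, then mutation
        population = [mutate(crossover(random.choice(selected),
                                       random.choice(selected)))
                      for _ in range(pop_size)]
    return min(population, key=fitness)
```

With integer individuals, an averaging crossover and a small random mutation, this skeleton converges quickly on a toy objective such as (x − 3)².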
After a number of generations, the program converges and the best individuals represent a near-optimum solution. The GA procedure is shown below.

begin
  t := 0
  initialize Population(t)
  evaluate Population(t)
  while (not termination-condition) do
  begin
    t := t + 1
    select Population(t) from Population(t − 1)
    variate Population(t)
    evaluate Population(t)
  end
end

4. Batch-sizing problem (BSP)

Consider a batch manufacturing environment that processes raw materials procured from outside suppliers to convert them into finished products for retailers. The manufacturing batch size is

dependent on the retailer's sales volume (market demand), unit product cost, set-up cost, and inventory holding cost. The raw-material purchasing lot size depends on the raw-material requirement of the manufacturing system, unit raw-material cost, ordering cost, and inventory holding cost. Therefore, the optimal raw-material purchasing quantity may not equal the raw-material requirement for an optimal manufacturing batch size. To operate the manufacturing system optimally, it is necessary to optimize the activities of both raw-material purchasing and production batch sizing simultaneously, taking all operating parameters into consideration. A larger manufacturing batch size reduces the set-up cost component of the overall unit product cost. The products produced in one batch (one manufacturing cycle) are delivered to the retailer in m small lots (that is, m retailer cycles per manufacturing cycle) at fixed time intervals. This delivery system is common in a batch manufacturing environment (Sarker and Parija, 1994; 1996). In the pharmaceuticals and chemicals industries, critical products are not delivered to the retailers until the whole lot is finished and quality certification is ready (Sarker and Khan, 1999; Sarker and Newton, 2000). The inventory level of such products increases linearly at the production rate during the production up-time, and it follows a staircase pattern during production down-time in each inventory cycle. The manufacturer procures raw materials from outside suppliers in lots of fixed quantity at equal intervals of time. The manufacturer therefore has a number of options on how much to receive in each lot relative to the manufacturing batch size (see Sarker and Khan, 1999). In this paper, we consider that one lot of raw material is consumed in n manufacturing cycles.
This is a typical case of having a higher ordering cost relative to the inventory holding cost.

5. Mathematical formulation

The unconstrained models under various situations are presented in Sarker and Khan (1997; 1999) and Sarker and Newton (2000), and the single raw material constrained model was recently developed by Sarker and Newton (1999). In this paper, we extend the single raw material model into a multiple raw materials constrained model. The notation used in developing the model is as follows:

D_p = demand rate of product p, units per year
P_p = production rate, units per year (here, P_p > D_p)
Q_p = production lot size
H_p = annual inventory holding cost, $/unit/year
A_p = set-up cost for product p, $/setup
r_i = quantity of raw material i required to produce one unit of product p
D_i = annual demand of raw material i for product p, D_i = r_i D_p
Q_i = ordering quantity of raw material i
A_i = ordering cost of raw material i
H_i = annual inventory holding cost for raw material i
PR_i = price of raw material i

Q_i* = optimum ordering quantity of raw material i
x_p = shipment quantity to the customer at a regular interval, units/shipment
L = time between successive shipments = x_p/D_p
T = cycle time measured in years = Q_p/D_p
m_p = number of full shipments during the cycle time = T/L
n_i = number of production cycles which consume one lot of raw material i = Q_i/Q_p
s_i = space required by one unit of raw material i
S_p = space required by one unit of finished product
TC = total cost of the system

The mathematical model is presented below:

Minimize TC = (D_p/(m_p x_p)) [A_p + Σ_i A_i/n_i] + (m_p x_p/2)(D_p/P_p) Σ_i n_i r_i H_i + (x_p/2) [m_p(1 − D_p/P_p) + 2D_p/P_p − 1] H_p

Subject to:
Σ_i n_i r_i m_p x_p s_i ≤ Raw_S_Cap
Σ_i n_i r_i m_p x_p s_i ≥ Min_Truck_Load
m_p x_p S_p ≤ Finish_S_Cap
m_p and n_i integer and greater than zero.

This is clearly a nonlinear integer program. The first constraint indicates that the storage space required by the raw materials must not exceed the space specified for raw materials. The second constraint represents the lower truckload limit for transportation, and the third constraint is the finished-products storage capacity.

6. Transformation methods

The transformation, from a constrained to an unconstrained problem, in evolutionary algorithms is almost the same as in the classical approach, with a slight difference. The value of μ_k is changed as a function of the generation k, and is not necessarily equal to μ_{k−1}. Several selected transformation techniques used in the evolutionary computation literature for solving constrained optimization problems are presented below. Most, if not all, are of the exterior kind, which allows the initial population of solutions to be partially or completely infeasible. In some of these techniques, the controlling parameters are updated in a predetermined manner.
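In all of these exterior techniques, an individual is evaluated through the penalized objective of problem P2. A minimal sketch, assuming a squared-violation penalty α(x) (a common choice; the paper does not commit to a particular form of α):

```python
def penalized_objective(f, g_list, h_list, mu):
    """Build f_p(x) = f(x) + mu * alpha(x) for problem P2, where
    alpha(x) sums squared violations of g(x) <= 0 and h(x) = 0."""
    def f_p(x):
        alpha = sum(max(0.0, g(x)) ** 2 for g in g_list)  # inequality violations
        alpha += sum(h(x) ** 2 for h in h_list)           # equality violations
        return f(x) + mu * alpha
    return f_p
```

A feasible point incurs no penalty, so f_p coincides with f there; infeasible points are penalized in proportion to μ.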
The controlling (or penalty) parameters μ_k may be updated each generation or every n generations.

6.1. Static penalties

The method of static penalties (Homaifar et al., 1994) assumes that for every constraint we establish a family of intervals that determine the appropriate penalty coefficient. It is clear that the results are parameter dependent. A limited set of experiments reported by Michalewicz (1995) indicate that the

method can provide good results if violation levels and penalty coefficients are tuned to the problem. We use three levels of fixed penalty coefficients for each constraint: 0.5, 0.… and 0.…

6.2. Dynamic penalties

Joines and Houck (1994) proposed dynamic penalties. The authors assume that μ_k = (Ck)^α, where C and α are constants. A reasonable choice for these parameters is C = 0.5 and α = 2. This method requires a much smaller number of parameters (independent of the number of constraints) than the first method. Also, instead of defining several levels of violation, the pressure on infeasible solutions is increased by the (Ck)^α component of the penalty term: towards the end of the process (for high values of the generation number k) this component assumes large values.

6.3. Annealing penalties

The method of annealing penalties, called Genocop II (for Genetic Algorithms for Numerical Optimization of Constrained Problems), is also based on dynamic penalties and was described by Michalewicz and Attia (1994) and Michalewicz (1996). In this system, a fixed μ_k is used for all constraints of a given generation k, where μ_k = 1/(2τ). An initial temperature τ is set and is used to evaluate the individuals. After a certain number of generations, the temperature τ is decreased and the best solution found so far serves as the starting point for the next iteration. This process continues until the temperature reaches the freezing point. Genocop II proved its usefulness for linearly constrained optimization problems; it gave a surprisingly good performance for many functions (Michalewicz et al., 1994; Michalewicz, 1996). Michalewicz and Attia (1994) proposed the control parameters τ_{k+1} = τ_k/10, where τ_0 = 1 and τ_k remains constant once it reaches the freezing point.

6.4. Adaptive penalties

Adaptive transformation attempts to use information from the search to adjust the control parameters. This is usually done by examining the fitness of feasible and infeasible members of the current population.
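The dynamic and annealing updates above amount to simple schedules for μ_k; a sketch under the parameter choices quoted in the text:

```python
def dynamic_mu(k, C=0.5, alpha=2):
    """Joines & Houck dynamic penalty coefficient: mu_k = (C*k)**alpha,
    which grows with the generation number k."""
    return (C * k) ** alpha

def annealing_mu(tau):
    """Genocop II style coefficient: mu_k = 1/(2*tau), where the
    temperature tau is lowered after a fixed number of generations
    (e.g. tau <- tau/10) until it reaches the freezing point."""
    return 1.0 / (2.0 * tau)
```

Under the dynamic schedule the coefficient grows quadratically with k, so late-generation infeasibility is penalized heavily; under the annealing schedule each cooling step multiplies the coefficient tenfold.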
Bean and Hadj-Alouane proposed an adaptive penalty method in 1992 which uses feedback from the search process (see Michalewicz and Schoenauer, 1996). This method allows either an increase or a decrease of the imposed penalty during evolution. It involves the selection of two constants, β_1 and β_2 (both greater than 1), to adaptively update the penalty-function multiplier, and the evaluation of the feasibility of the best solution over successive intervals of N_f generations. As the search progresses, the penalty-function multiplier is updated every N_f generations based on whether or not the best solution was feasible during that interval. The parameters were chosen to be similar to those of the above methods, with N_f = 3, β_1 = 5, β_2 = 10 and μ_0 = 1. In our experiments, we always increased the μ values.

6.5. Death penalty

The death-penalty method simply rejects infeasible individuals. It can be interpreted as a truncation selection based on α(x) followed by a selection procedure based on f(x). In this method, the

initial population must be feasible. In the experiments reported by Michalewicz (1995), the method was shown to give a generally poor performance. The method is also less stable than the others; the standard deviation of the solutions returned is relatively high.

6.6. Superiority of feasible points

The method of superiority of feasible points was developed by Powell and Skolnick (1993) and is based on a classical penalty approach, with one notable exception. Each individual is evaluated by the formula f_p(x) = f(x) + μ[α(x) + θ(t, x)], where μ is a constant and the component θ(t, x) is an additional iteration-dependent function that influences the evaluation of infeasible solutions. The point is that the method distinguishes between feasible and infeasible individuals by adopting an additional heuristic rule: for any feasible individual x and any infeasible individual y, f_p(x) < f_p(y) (i.e., any feasible solution is better than any infeasible one). The penalties are increased for infeasible individuals. The method performs reasonably well; however, for some problems it may have difficulty in locating a feasible solution (Michalewicz, 1995).

7. Solving BSP using genetic algorithms

Two types of genetic algorithms were tested for each of the methods discussed in the previous section: a simple genetic algorithm (SGA) and a real coded genetic algorithm (RGA). The basic components of these two algorithms are discussed briefly below.

The first hurdle to overcome in using GAs is problem representation. The representation often relies on binary coding. GAs work with a population of competing strings, each of which represents a potential solution to the problem under investigation. The individual strings within the population are gradually transformed using biological-based operations.
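The superiority-of-feasible-points rule described above reduces, in a tournament setting, to a pairwise comparison; a sketch (ignoring the iteration-dependent term θ(t, x), and with hypothetical `f` and `alpha` functions):

```python
def tournament_winner(f, alpha, x, y):
    """Pairwise comparison implementing 'any feasible solution is
    better than any infeasible one' (alpha(x) == 0 means feasible);
    minimization is assumed."""
    ax, ay = alpha(x), alpha(y)
    if ax == 0 and ay == 0:
        return x if f(x) <= f(y) else y   # both feasible: compare objectives
    if ax == 0:
        return x                          # only x feasible
    if ay == 0:
        return y                          # only y feasible
    return x if ax <= ay else y           # both infeasible: smaller violation
```

This ordering never lets an infeasible individual beat a feasible one, which is the heuristic rule at the heart of the method.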
In accordance with the law of survival of the fittest, the best-performing individuals eventually dominate the population. We also use real coding, where the variable values are generated as real numbers.

The next step is to initialize the population. There are no strict rules for determining the population size. Larger populations ensure greater diversity but require more computing resources. We use a population size of 50, as commonly used by many researchers. Once the population size is chosen, the initial population must be randomly generated. For binary coding, the random number generator generates random bits (0 or 1), and each individual in the population is a string of n bits. For real coding, the actual numbers are generated randomly.

Given a population of potential problem solutions, we need to see how good they are. Therefore, we calculate a fitness, or performance, for each string. The fitness function is the penalized objective function f_p in our case. For binary coding, each string is decoded into its decimal equivalent. This gives a candidate value for the solution, which is used to calculate the fitness value. For real coding, we use the variable values directly.

Genetic algorithms work with a population of competing problem solutions that gradually evolve over successive generations. We use a binary tournament selection, where two individuals are selected at random from the population pool and the better one is allowed to survive to the next generation. The chromosomes which survive the selection step undergo the genetic operations of crossover and mutation. Crossover is the step that really powers the GA. It allows the search to fan out in diverse directions

looking for attractive solutions and permits two strings to `mate'. This may result in offspring that are `fitter' than their parents. We use two-point crossover, as it is the general choice in many binary coded genetic algorithms because it minimizes disruptive effects (for more on disruptive effects see Booker (1997)). We use a heuristic crossover for the real coded GA, as suggested by Fogel (1997) and Michalewicz (1996). In this crossover, the operator generates a single offspring x_3 from two parents x_1 and x_2 according to the rule x_3 = round[r·(x_2 − x_1) + x_2], where r is a random number between 0 and 1 and the parent x_2 is no worse than x_1. When the generated value x_3 is out of bounds (upper or lower), it is set equal to the violated bound. Note that x_1 and x_2 are `parent variables', not parents, and x_3 is rounded to an integer after crossing; this is not required for binary coding. Rounding is also performed after the mutation operator.

Mutation introduces random deviations into the population. For binary coding, mutation changes a 0 into a 1 and vice versa. Mutation is usually performed with low probability, since otherwise it would defeat the order building generated through selection and crossover. Mutation attempts to bump the population gently onto a slightly better course. We used nonuniform mutation for the real coded GA. In this mutation, we set the system parameter b, which determines the degree of non-uniformity (for more see Michalewicz et al., 1994).

All GA runs have the following standard characteristics:

Probability of crossover: 1.0
Probability of mutation: 1/(string length of the chromosome)
Population size: 50
Number of generations in each run: 200
Number of independent runs: 300
For SGA: binary coding, two-point crossover, and bit-wise mutation
For RGA: heuristic crossover and nonuniform mutation

8. Computational results

The mathematical model presented in an earlier section is solved for a test problem of a single product with three raw materials, using the six different penalty functions with the binary and real coded GAs described earlier. The data used for the model are as follows:

Sample data: D_p = …; P_p = …; A_p = $50.00; H_p = $1.20; x_p = 1,000 units; S_p = 1.00.

        Raw material 1   Raw material 2   Raw material 3
A_i     3,000            2,500            4,600
H_i     …                …                …
s_i     …                …                …
r_i     …                …                …

With the RHS values Raw_S_Cap = 400,000, Truck_Min_load = 200,000 and Fin_S_Cap

= 20,000, the best solution found was m = 15, n_1 = 15, n_2 = 19, and n_3 = 19, with TC = 3.3471E5. The comparison of the six penalty functions is shown in Table 1 for real coded GAs and in Table 2 for binary coded GAs. The minimum, presented in column 2 of both tables, is the best objective function value obtained in any of the 300 runs, and the maximum, presented in column 6, is the worst. Statistics (mean, standard deviation and median) of the objective function values over the 300 runs are also given, to indicate the variability of the target value. The far right-hand column reports the percentage of the 300 runs in which the optimal solution was found.

Table 1 Results with real coded GAs
Method                           Minimum   Mean   Std. Dev.   Median   Maximum   Optimal (%)
Death penalty                    …
Superiority of feasible points   …
Dynamic penalty                  …
Annealing penalty                …
Static penalty                   …
Adaptive penalty                 …

Table 2 Results with binary coded GAs
Method                           Minimum   Mean   Std. Dev.   Median   Maximum   Optimal (%)
Death penalty                    …
Superiority of feasible points   …
Dynamic penalty                  …
Annealing penalty                …
Static penalty                   …
Adaptive penalty                 …

With tighter RHS values, Raw_S_Cap = 130,000, Truck_Min_load = 250,000 and Fin_S_Cap = 13,000, the best solution found was m = 12, n_1 = 8, n_2 = 11, and n_3 = 12, with TC = 4.2837E5. The comparison of the six penalty functions is shown in Table 3 for real coded GAs and in Table 4 for binary coded GAs. In terms of the percentage of runs in which the optimal solution was obtained, the results are always better with real coded GAs than with binary coded GAs for all six penalty functions considered in this paper. The results are worse for the more tightly constrained problems. All the penalty functions, except superiority of feasible points, worked extremely well with real coded GAs for the first test problem.
However, the death-penalty method, along with superiority of feasible points, performed badly on the more tightly constrained problems, as shown in Table 3. Nevertheless, all six penalty functions, with real coded GAs, produced the unique optimal solution for both test problems. These solutions are acceptable irrespective of how many times the optimal solution is generated over the runs. The binary coded GAs failed to produce any optimal solution for the tighter test problem.

Table 3 Results with real coded GAs
Method                           Minimum   Mean   Std. Dev.   Median   Maximum   Optimal (%)
Death penalty                    …
Superiority of feasible points   …
Dynamic penalty                  …
Annealing penalty                …
Static penalty                   …
Adaptive penalty                 …

Table 4 Results with binary coded GAs
Method                           Minimum   Mean   Std. Dev.   Median   Maximum   Optimal (%)
Death penalty                    …
Superiority of feasible points   …
Dynamic penalty                  …
Annealing penalty                …
Static penalty                   …
Adaptive penalty                 …

The results of five test problems are plotted in Figure 1. Problem 1 has loose constraints and problem 5 has the tightest constraints; problems 2 to 4 lie in between. The performance of the superiority of feasible points and death penalty methods is consistently poor compared to the other methods.

Fig. 1. Comparison of performances for six different methods (percentage of optimum solutions obtained for problems P1 to P5).

Although the performances of all methods decrease with tighter

constraints, the performances of four methods (the static, dynamic, annealing, and adaptive methods) are very close in all cases.

8.1. Calculating the penalty coefficient

Consider the commonly used binary tournament selection, in which two individuals are chosen randomly from the population and the better of them is allowed to survive to the next generation. Consider a comparison between two individuals 1 and 2, transformed to the general form of problem P2:

f_1(x) + μα_1(x) > f_2(x) + μα_2(x), for a given penalty parameter μ > 0.

From this inequality we can write μ < [f_1(x) − f_2(x)]/[α_2(x) − α_1(x)], or μ_c = [f_1(x) − f_2(x)]/[α_2(x) − α_1(x)].

Evolutionary algorithms are population-based search methods, and the population information can be used to estimate the most appropriate value of μ. This value would be the one that attempts to balance the number of competitions decided by the objective function against the number decided by the penalty function. It may be approximated by simulating the binary tournament selection and computing, for each comparison, the critical penalty coefficient μ_c. For the actual selection, the μ used is the average of these values. The results for the two test problems using this new way of calculating the penalty coefficient are presented in Table 5. As we can see, the proposed method produced optimal solutions in 100% of cases for test problem 1 and in 68% of cases for a tighter problem like test problem 2. Although this approach produces better solutions, in terms of the percentage of optimum solutions obtained, than any of the existing penalty-function based GAs, extensive experimentation is required before its better performance can be claimed.

9. Conclusions

We considered a single-product, multiple-raw-materials joint batch-sizing problem. The mathematical programming model of this problem forms a constrained nonlinear integer program.
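The critical-coefficient estimate of Section 8.1 can be simulated directly; a sketch, where `f` and `alpha` are hypothetical objective and penalty functions, and the absolute value is one simple way of keeping the averaged coefficient positive:

```python
import random

def estimate_mu(f, alpha, population, samples=100, rng=random):
    """Estimate a penalty coefficient as the average critical value
    mu_c = (f(x1) - f(x2)) / (alpha(x2) - alpha(x1)) over simulated
    binary-tournament pairs drawn from the population."""
    mus = []
    for _ in range(samples):
        x1, x2 = rng.sample(population, 2)
        a1, a2 = alpha(x1), alpha(x2)
        if a1 == a2:
            continue  # this comparison is decided by the objective alone
        mus.append(abs((f(x1) - f(x2)) / (a2 - a1)))
    return sum(mus) / len(mus) if mus else 1.0
```

Because the estimate is recomputed from the current population, μ adapts each generation to the prevailing balance between objective values and constraint violations.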
It is difficult to solve such a complex model using existing algorithms. In this paper, we investigated the use of genetic algorithms (GAs) for solving such a model with one product and three raw materials. We used both binary and real coded genetic algorithms with six different penalty functions reported in the literature. We also reported a new algorithm which we developed recently. Two test problems were solved to demonstrate the use of GAs in solving the BSP. The results were compared and analysed for all seven penalty functions with binary and real coded GAs.

Table 5 Performance of the proposed method
Problem and code   Minimum   Mean   Std. Dev.   Median   Maximum   Opt. soln. (% of runs)
P1 Real            …
P1 Binary          …
P2 Real            …
P2 Binary          …

The real coded genetic algorithm works well

compared to binary coding. The new method works very well compared to the existing methods. Among the penalty functions, the death-penalty method and the superiority-of-feasible-points method were found to be the worst. More experimentation is required to confirm the performance of the new method of calculating the penalty coefficient, and the method also needs to be tested on other types of constrained optimization problems, such as models with continuous variables and mixed-integer models. Some work should be done to accelerate the convergence of the algorithm. Optimal parameter selection in GAs for a class of problems would be very interesting work, and developing a set of benchmark problems covering all extreme cases to judge the performance of penalty-function-based GAs would provide a challenge for the future.

Acknowledgement

This work is supported by a UC special research grant, ADFA, University of New South Wales, Australia, awarded to Dr Ruhul Sarker. We thank Professor Xin Yao for many useful comments during the conduct of this research.

References

Back, T., Schwefel, H.-P. An Overview of Evolutionary Algorithms for Parameter Optimization. Evolutionary Computation, 1, 1–24.
Bazaraa, M.S., Shetty, C.M. Nonlinear Programming: Theory and Algorithms. John Wiley & Sons, New York, Chapter 9.
Bean, J.C., Hadj-Alouane, A.B. A Dual Genetic Algorithm for Bounded Integer Programs. Technical Report TR92-53, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, MI.
Booker, L. Binary Strings. In Back, T., Fogel, D., Michalewicz, Z. (Eds.), Handbook of Evolutionary Computation. Oxford University Press, Oxford, UK, C3.3:1–10.
Davis, L. Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York.
Fogel, D. Real-valued Vectors. In Back, T., Fogel, D., Michalewicz, Z. (Eds.), Handbook of Evolutionary Computation. Oxford University Press, Oxford, UK, C3.3:11–20.
Fogel, D., Stayton, L. On the Effectiveness of Crossover in Simulated Evolutionary Optimization. BioSystems, 32, 171–182.
Goldberg, D. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA, USA.
Holland, J. Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, MI, USA.
Homaifar, A., Qi, C.X., Lai, S.H. Constrained Optimization Via Genetic Algorithms. Simulation, April, 242–253.
Joines, J.A., Houck, C.R. On the Use of Non-Stationary Penalty Functions to Solve Nonlinear Constrained Optimization Problems with GAs. Proceedings of the IEEE ICEC 1994, pp. 579–584.
Khouja, M., Michalewicz, Z., Wilmot, M. The Use of Genetic Algorithms to Solve the Economic Lot Size Scheduling Problem. European Journal of Operational Research, 110, 509–524.
Michalewicz, Z. Genetic Algorithms, Numerical Optimization, and Constraints. In Eshelman, L.J. (Ed.), Proceedings of the 6th International Conference on Genetic Algorithms (Pittsburgh, PA, July 1995). Morgan Kaufmann, San Mateo, CA, 151–158.
Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs, 3rd edn. Springer-Verlag, New York.
Michalewicz, Z., Attia, N. Evolutionary Optimization of Constrained Problems. In Sebald, A.V., Fogel, L.J. (Eds.), Proceedings of the 3rd Annual Conference on Evolutionary Programming. World Scientific, River Edge, NJ, 98–108.
Michalewicz, Z., Logan, T., Swaminathan, S. Evolutionary Operators for Continuous Convex Parameter Spaces. In

Sebald, A.V., Fogel, L.J. (Eds.), Proceedings of the 3rd Annual Conference on Evolutionary Programming. World Scientific, River Edge, NJ, 84–97.
Michalewicz, Z., Schoenauer, M. Evolutionary Algorithms for Constrained Parameter Optimization Problems. Evolutionary Computation, 4(1), 1–32.
Powell, D., Skolnick, M.M. Using Genetic Algorithms in Engineering Design Optimization with Nonlinear Constraints. In Forrest, S. (Ed.), Proceedings of the 5th International Conference on Genetic Algorithms. Morgan Kaufmann, San Mateo, CA, 424–430.
Runarsson, T., Sarker, R. Constrained Nonlinear Integer Programming and Evolution Strategies. Proceedings of the Third Australia–Japan Joint Workshop on Intelligent and Evolutionary Systems, Canberra, November, 193–200.
Sarker, B.R., Parija, G.R. Optimal Batch Size and Raw Material Ordering Policy for a Production System with a Fixed-Interval, Lumpy Demand Delivery System. European Journal of Operational Research, 89, 593–608.
Sarker, B.R., Parija, G.R. An Optimal Batch Size for a Production System Operating Under a Fixed-Quantity, Periodic Delivery Policy. Journal of the Operational Research Society, 45(8), 891–900.
Sarker, R.A., Khan, L.R. An Optimal Batch Size for a Manufacturing System Operating Under a Periodic Delivery Policy. Conference of the Asia-Pacific Operational Research Societies (APORS), Melbourne, Australia, November–December.
Sarker, R., Khan, L. An Optimal Batch Size for a Production System Operating Under a Periodic Delivery Policy. Computers & Industrial Engineering, 37(4), 711–730.
Sarker, R.A., Newton, C. Determination of Optimal Batch Size for a Manufacturing System. In Yang, X., Mees, A., Fisher, M., Jennings, L. (Eds.), Progress in Optimization. Kluwer Academic Publishers, The Netherlands, pp. 315–327.
Sarker, R.A., Newton, C. Genetic Algorithms for Solving Economic Lot Size Problems.
Proceedings of the 26th International Conference on Computers and Industrial Engineering, December 15–17, Melbourne, Australia, 789–793.
Sarker, R., Yao, X. (forthcoming). Simulated Annealing for Solving a Manufacturing Batch-Sizing Problem.
Schwefel, H.-P. Evolution and Optimum Seeking. Wiley, New York, p. 16.
Smith, A.E., Coit, D.W. Constraint-Handling Techniques: Penalty Functions. In Handbook of Evolutionary Computation, Release 97/1. IOP Publishing Ltd and Oxford University Press, C5.2:1–6.


More information

An empirical study on GAs without parameters

An empirical study on GAs without parameters An empirical study on GAs without parameters Th. Bäck, and A.E. Eiben,3 and N.A.L. van der Vaart Leiden University, ICD Dortmund, 3 Free University Amsterdam Abstract. In this paper we implement GAs that

More information

THE Multiconstrained 0 1 Knapsack Problem (MKP) is

THE Multiconstrained 0 1 Knapsack Problem (MKP) is An Improved Genetic Algorithm for the Multiconstrained 0 1 Knapsack Problem Günther R. Raidl Abstract This paper presents an improved hybrid Genetic Algorithm (GA) for solving the Multiconstrained 0 1

More information

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search A JOB-SHOP SCHEDULING PROBLEM (JSSP) USING GENETIC ALGORITHM (GA) Mahanim Omar, Adam Baharum, Yahya Abu Hasan School of Mathematical Sciences, Universiti Sains Malaysia 11800 Penang, Malaysia Tel: (+)

More information

Using Genetic Algorithms to optimize ACS-TSP

Using Genetic Algorithms to optimize ACS-TSP Using Genetic Algorithms to optimize ACS-TSP Marcin L. Pilat and Tony White School of Computer Science, Carleton University, 1125 Colonel By Drive, Ottawa, ON, K1S 5B6, Canada {mpilat,arpwhite}@scs.carleton.ca

More information

Using Genetic Algorithm with Triple Crossover to Solve Travelling Salesman Problem

Using Genetic Algorithm with Triple Crossover to Solve Travelling Salesman Problem Proc. 1 st International Conference on Machine Learning and Data Engineering (icmlde2017) 20-22 Nov 2017, Sydney, Australia ISBN: 978-0-6480147-3-7 Using Genetic Algorithm with Triple Crossover to Solve

More information

Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm

Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm I. Bruha and F. Franek Dept of Computing & Software, McMaster University Hamilton, Ont., Canada, L8S4K1 Email:

More information

CHAPTER 8 DISCUSSIONS

CHAPTER 8 DISCUSSIONS 153 CHAPTER 8 DISCUSSIONS This chapter discusses the developed models, methodologies to solve the developed models, performance of the developed methodologies and their inferences. 8.1 MULTI-PERIOD FIXED

More information

Multi-Objective Optimization Using Genetic Algorithms

Multi-Objective Optimization Using Genetic Algorithms Multi-Objective Optimization Using Genetic Algorithms Mikhail Gaerlan Computational Physics PH 4433 December 8, 2015 1 Optimization Optimization is a general term for a type of numerical problem that involves

More information

DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES

DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES SHIHADEH ALQRAINY. Department of Software Engineering, Albalqa Applied University. E-mail:

More information

Sci.Int.(Lahore),28(1), ,2016 ISSN ; CODEN: SINTE 8 201

Sci.Int.(Lahore),28(1), ,2016 ISSN ; CODEN: SINTE 8 201 Sci.Int.(Lahore),28(1),201-209,2016 ISSN 1013-5316; CODEN: SINTE 8 201 A NOVEL PLANT PROPAGATION ALGORITHM: MODIFICATIONS AND IMPLEMENTATION Muhammad Sulaiman 1, Abdel Salhi 2, Eric S Fraga 3, Wali Khan

More information

Four Methods for Maintenance Scheduling

Four Methods for Maintenance Scheduling Four Methods for Maintenance Scheduling Edmund K. Burke, University of Nottingham, ekb@cs.nott.ac.uk John A. Clark, University of York, jac@minster.york.ac.uk Alistair J. Smith, University of Nottingham,

More information

A NEW APPROACH TO SOLVE ECONOMIC LOAD DISPATCH USING PARTICLE SWARM OPTIMIZATION

A NEW APPROACH TO SOLVE ECONOMIC LOAD DISPATCH USING PARTICLE SWARM OPTIMIZATION A NEW APPROACH TO SOLVE ECONOMIC LOAD DISPATCH USING PARTICLE SWARM OPTIMIZATION Manjeet Singh 1, Divesh Thareja 2 1 Department of Electrical and Electronics Engineering, Assistant Professor, HCTM Technical

More information

Selection of Optimal Path in Routing Using Genetic Algorithm

Selection of Optimal Path in Routing Using Genetic Algorithm Selection of Optimal Path in Routing Using Genetic Algorithm Sachin Kumar Department of Computer Science and Applications CH. Devi Lal University, Sirsa, Haryana Avninder Singh Department of Computer Science

More information

MIC 2009: The VIII Metaheuristics International Conference. A Comparative Study of Adaptive Mutation Operators for Genetic Algorithms

MIC 2009: The VIII Metaheuristics International Conference. A Comparative Study of Adaptive Mutation Operators for Genetic Algorithms : The VIII Metaheuristics International Conference id-1 A Comparative Study of Adaptive Mutation Operators for Genetic Algorithms Imtiaz Korejo, Shengxiang Yang, and ChangheLi Department of Computer Science,

More information

A COMPARATIVE STUDY OF FIVE PARALLEL GENETIC ALGORITHMS USING THE TRAVELING SALESMAN PROBLEM

A COMPARATIVE STUDY OF FIVE PARALLEL GENETIC ALGORITHMS USING THE TRAVELING SALESMAN PROBLEM A COMPARATIVE STUDY OF FIVE PARALLEL GENETIC ALGORITHMS USING THE TRAVELING SALESMAN PROBLEM Lee Wang, Anthony A. Maciejewski, Howard Jay Siegel, and Vwani P. Roychowdhury * Microsoft Corporation Parallel

More information

Genetic Algorithms for Real Parameter Optimization

Genetic Algorithms for Real Parameter Optimization Genetic Algorithms for Real Parameter Optimization Alden H. Wright Department of Computer Science University of Montana Missoula, Montana 59812 Abstract This paper is concerned with the application of

More information

Escaping Local Optima: Genetic Algorithm

Escaping Local Optima: Genetic Algorithm Artificial Intelligence Escaping Local Optima: Genetic Algorithm Dae-Won Kim School of Computer Science & Engineering Chung-Ang University We re trying to escape local optima To achieve this, we have learned

More information

Search Space Boundary Extension Method in Real-Coded Genetic Algorithms

Search Space Boundary Extension Method in Real-Coded Genetic Algorithms Information Sciences, Vol. 33/3-4, pp. 9-47 (00.5) Search Space Boundary Extension Method in Real-Coded Genetic Algorithms Shigeyoshi Tsutsui* and David E. Goldberg** * Department of Management and Information

More information

Hybridization EVOLUTIONARY COMPUTING. Reasons for Hybridization - 1. Naming. Reasons for Hybridization - 3. Reasons for Hybridization - 2

Hybridization EVOLUTIONARY COMPUTING. Reasons for Hybridization - 1. Naming. Reasons for Hybridization - 3. Reasons for Hybridization - 2 Hybridization EVOLUTIONARY COMPUTING Hybrid Evolutionary Algorithms hybridization of an EA with local search techniques (commonly called memetic algorithms) EA+LS=MA constructive heuristics exact methods

More information

Genetic Algorithms For Vertex. Splitting in DAGs 1

Genetic Algorithms For Vertex. Splitting in DAGs 1 Genetic Algorithms For Vertex Splitting in DAGs 1 Matthias Mayer 2 and Fikret Ercal 3 CSC-93-02 Fri Jan 29 1993 Department of Computer Science University of Missouri-Rolla Rolla, MO 65401, U.S.A. (314)

More information

DE/EDA: A New Evolutionary Algorithm for Global Optimization 1

DE/EDA: A New Evolutionary Algorithm for Global Optimization 1 DE/EDA: A New Evolutionary Algorithm for Global Optimization 1 Jianyong Sun, Qingfu Zhang and Edward P.K. Tsang Department of Computer Science, University of Essex, Wivenhoe Park, Colchester, CO4 3SQ,

More information

SELF-ADAPTATION IN GENETIC ALGORITHMS USING MULTIPLE GENOMIC REDUNDANT REPRESENTATIONS ABSTRACT

SELF-ADAPTATION IN GENETIC ALGORITHMS USING MULTIPLE GENOMIC REDUNDANT REPRESENTATIONS ABSTRACT Proceedings of Student/Faculty Research Day, CSIS, Pace University, May 7th, 2004 SELF-ADAPTATION IN GENETIC ALGORITHMS USING MULTIPLE GENOMIC REDUNDANT REPRESENTATIONS Maheswara Prasad Kasinadhuni, Michael

More information

Abstract. 1 Introduction

Abstract. 1 Introduction Shape optimal design using GA and BEM Eisuke Kita & Hisashi Tanie Department of Mechano-Informatics and Systems, Nagoya University, Nagoya 464-01, Japan Abstract This paper describes a shape optimization

More information

March 19, Heuristics for Optimization. Outline. Problem formulation. Genetic algorithms

March 19, Heuristics for Optimization. Outline. Problem formulation. Genetic algorithms Olga Galinina olga.galinina@tut.fi ELT-53656 Network Analysis and Dimensioning II Department of Electronics and Communications Engineering Tampere University of Technology, Tampere, Finland March 19, 2014

More information

Surrogate Gradient Algorithm for Lagrangian Relaxation 1,2

Surrogate Gradient Algorithm for Lagrangian Relaxation 1,2 Surrogate Gradient Algorithm for Lagrangian Relaxation 1,2 X. Zhao 3, P. B. Luh 4, and J. Wang 5 Communicated by W.B. Gong and D. D. Yao 1 This paper is dedicated to Professor Yu-Chi Ho for his 65th birthday.

More information

Solving Sudoku Puzzles with Node Based Coincidence Algorithm

Solving Sudoku Puzzles with Node Based Coincidence Algorithm Solving Sudoku Puzzles with Node Based Coincidence Algorithm Kiatsopon Waiyapara Department of Compute Engineering, Faculty of Engineering, Chulalongkorn University, Bangkok, Thailand kiatsopon.w@gmail.com

More information

Genetic Algorithm for Dynamic Capacitated Minimum Spanning Tree

Genetic Algorithm for Dynamic Capacitated Minimum Spanning Tree 28 Genetic Algorithm for Dynamic Capacitated Minimum Spanning Tree 1 Tanu Gupta, 2 Anil Kumar 1 Research Scholar, IFTM, University, Moradabad, India. 2 Sr. Lecturer, KIMT, Moradabad, India. Abstract Many

More information

CONCEPT FORMATION AND DECISION TREE INDUCTION USING THE GENETIC PROGRAMMING PARADIGM

CONCEPT FORMATION AND DECISION TREE INDUCTION USING THE GENETIC PROGRAMMING PARADIGM 1 CONCEPT FORMATION AND DECISION TREE INDUCTION USING THE GENETIC PROGRAMMING PARADIGM John R. Koza Computer Science Department Stanford University Stanford, California 94305 USA E-MAIL: Koza@Sunburn.Stanford.Edu

More information