Model specification search using a genetic algorithm with factor reordering for a simple structure factor analysis model

HIROTO MUROHASHI (1) and HIDEKI TOYODA
Faculty of Letters, Arts and Sciences, Waseda University, Shinjuku-ku, Tokyo, Japan
Japanese Psychological Research, 2007, Volume 49, No. 3

Abstract: Many techniques for automated model specification search based on numerical indices have been proposed, but no single decisive method has yet emerged. In the present article, the performance and features of a model specification search method using a genetic algorithm (GA) were examined. A GA is a robust and simple metaheuristic algorithm with great searching power. Although some research has already applied metaheuristics to the model-fitting task, we focus here on the search for a simple structure factor analysis model and propose a customized algorithm that deals with problems specific to that situation. First, an implementation of model specification search using a GA with factor reordering for a simple structure factor analysis model is proposed. Then, through a simulation study using generated data with a known true structure and through an example analysis using real data, the effectiveness and applicability of the proposed method were demonstrated.

Key words: factor analysis, model specification search, genetic algorithm, combinatorial optimization problem, structural equation modeling.

Model specification search (Long, 1983) is one of the important problems in structural equation modeling (SEM). In application studies in particular, models generated from researchers' hypotheses rarely fit the collected data perfectly (Henderson & Denison, 1989). Improvement of models based on actual data is therefore an essential process in SEM. In addition, researchers do not always have an explicit hypothesis and may be eager to use their data to explore for an appropriate model (Sörbom, 1989). In such situations, model specification search plays an important role.

The basic concept behind model specification search, which was first proposed by Akaike (1987), is to compare SEM models and choose the best-fitting model based on fit indices. Following this concept, we can regard model specification as a function into which a certain model is substituted and from which the goodness of that model is returned in numerical form. Thus, model specification search in SEM can be treated as a combinatorial optimization problem (Marcoulides & Drezner, 2001). Numerous automated model specification search methods have been proposed to date (e.g., Bentler, 1995; Jöreskog & Sörbom, 1996; Spirtes, Glymour, & Scheines, 1993). However, almost all algorithms of the widely used methods change only one parameter at a time according to a certain gradient. Their search space is therefore limited to the neighborhood of the initially proposed model, and the final selected model depends entirely on the initial model. They can thus be classified as local search methods.

(1) Correspondence concerning this article should be sent to: Hiroto Murohashi, Kounandai, Kounan-ku, Yokohama, Kanagawa, Japan. (E-mail: murohashi@es99.net)
© 2007 Japanese Psychological Association. Published by Blackwell Publishers Ltd.

Because of this dependency on the starting point, local search methods have a critical problem: they risk settling on locally optimal solutions (Harwood & Scheines, 2002). Hence, there is no guarantee that local search methods will find globally optimal solutions. One effective alternative that can be employed to overcome this problem is metaheuristics. Metaheuristics is a generic term for nondeterministic iterative algorithms that solve combinatorial optimization problems. It includes simulated annealing (Cerny, 1985), genetic algorithms (Holland, 1975), tabu search (Glover, 1989, 1990), simulated evolution (Kling & Banerjee, 1991), and stochastic evolution (Saab & Rao, 1991). As with local search methods, there is no assurance that metaheuristics will find the globally optimal solution. However, metaheuristics are known to be much more robust than local search methods (Sait & Youssef, 2000). When the search space is large and rugged, it is therefore recommended to use metaheuristics.

Some previous studies have already attempted to apply metaheuristics to the model-fitting task. For example, Marcoulides, Drezner, and Schumacker (1998) used tabu search for SEM, and Harwood and Scheines (2002) employed a genetic algorithm (GA) to determine a directed acyclic graph. In both of these studies, metaheuristics showed excellent results in simulation studies. Other studies have applied reversible jump Markov chain Monte Carlo methods to multiple change-point analysis (Green, 1995) and used tabu search for variable selection in multiple regression analysis (Mills, Olejnik, & Marcoulides, 2005). Applying metaheuristics to model specification search in SEM is therefore expected to yield good results. However, as mentioned above, metaheuristics comprises many kinds of algorithms, so we decided to focus on the GA in the present study.

The GA is an optimization technique proposed by Holland (1975) and is a representative metaheuristic algorithm. The unique feature of the GA is that it is based on the mechanism of natural selection, and it is thus also referred to as evolutionary computation (Bäck, Fogel, & Michalewicz, 1997). While the GA is still a developing theory, many studies have already been conducted and it has been applied with some success owing to its simplicity and robustness (e.g., Chambers, 1995; Gabbert, Brown, Huntley, Markowicz, & Sappingston, 1991; Syswerda & Palmucci, 1991). Furthermore, Marcoulides and Drezner (2001) indicated the feasibility of applying a GA to model specification search in SEM. The basic aim of our study is to test this out. Of course, it is possible that algorithms other than the GA would work better for model specification search in SEM, but, as a first step, it is important to verify whether the GA, one of the most widely applied and popular metaheuristic methods, is applicable at all.

As will be described later, when using a GA we must encode the solution of the target problem into a chromosomal format. Marcoulides and Drezner (2001) demonstrated an important method for overcoming this problem and indicated the capability of applying a GA to model specification search in SEM. However, their specific algorithm was very primitive, so applying it without any modification to real data analysis may lead to problems with the accuracy and efficiency of model searching.
The GA is a comparatively simple algorithm, but for that very reason it is necessary to tune the implemented algorithm to the nature of the target problem in order to achieve fast and accurate searching (Man, Tang, & Kwong, 1999). The first problem with the algorithm of Marcoulides and Drezner (2001) is that model complexity is not considered. This causes a tendency to choose models in which the degrees of freedom are small. In a practical situation, however, we usually do not try to employ such a complicated model, but instead prefer a simpler model that is easier to interpret. The second problem is that they do not account for equivalent models. Equivalent models differ in their path diagrams and represent different interpretations of the data, but have the same estimated covariance matrix; therefore, we cannot distinguish them by the values of fit indices. The third problem relates to the rotational indefiniteness of factors. Many models assumed in SEM contain factors, particularly in the field of psychology, and there can be models that have different factor loading patterns but substantially represent the same structure. Carrying out a model specification search without taking these latter kinds of problems into account may disturb the search. Moreover, we must consider the computational cost of using a GA, because a GA generally needs much more processing time than other metaheuristic algorithms (Youssef, Sadiq, Nassar, & Benten, 1995). The implemented algorithm should therefore be carefully designed to demand less computational effort; otherwise the elapsed time of searching will increase, making it harder to put a GA-based model specification search into practical use.

In light of these problems, and to confirm the availability and performance of a GA for model specification search in SEM from a practical standpoint, we confined the search space to a factor analysis model in this study. This restriction should decrease the computational effort of searching and solve problems that are unique to model specification search in SEM but are not considered in the algorithm of Marcoulides and Drezner (2001). A more detailed explanation of these advantages will be given later. Although reducing the search space damages the generality of the discussion to some extent and makes it impossible to refer to model specification search in general SEM, the factor analysis model is very popular in the field of psychology. We therefore expect our study to be particularly valuable from the viewpoint of application.

We also focus attention on parameter setting in the GA. One of the attributes of metaheuristics is that performance changes according to the parameters (Nonobe & Ibaraki, 1998). Despite this important factor, very few previous studies have explored the effect of parameter setting for model specification search in a factor analysis model or in general SEM. We therefore examined parameter setting first in a simulation study and then in a real data application. We also applied the GA model specification search to Holzinger and Swineford's (1939) test data and compared our results with the existing analysis.

Method of the genetic algorithm

When using a GA, we must express the solution of the target problem as a set of certain parameters (Man et al., 1999). The GA treats each parameter as a gene and refers to the set of all genes as a chromosome. For example, if the target problem is to minimize the function f(x1, x2), the genes would be the values of the parameters x1 and x2, and the concatenated string [x1, x2] would be regarded as the chromosome. With this analogy, the GA operates on the chromosomes instead of on the solution itself, and attempts to identify the chromosome that represents the optimal solution based on the mechanism of natural selection. The basic procedure of the GA is as follows; a minimal sketch of this cycle is given after the list.

1. Initialization: Generate individuals at random to form the initial population.
2. Reproduction: Select individuals from the population and apply the crossover operator and mutation operator to create new individuals.
3. Replacement: Merge the existing individuals in the population with the newborn individuals created through the reproduction process, then select the individuals with good chromosomes to re-form the population.
4. Repetition: Return to the reproduction process if a stopping criterion is not satisfied.
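As a concrete illustration, the following minimal R sketch implements this cycle (R is also the environment used for the analyses reported below). The functions evaluate_fitness(), reproduce(), and replace_population() are hypothetical placeholders for the operators described in the following sections, and the stopping rule corresponds to criterion (c) discussed under the repetition process.

    # Minimal sketch of the basic GA cycle. evaluate_fitness(), reproduce()
    # and replace_population() stand for the operators described below.
    run_ga <- function(init_population, max_stall = 3, max_gen = 100) {
      population <- init_population
      fitness    <- sapply(population, evaluate_fitness)
      best       <- max(fitness)
      stall      <- 0
      for (gen in seq_len(max_gen)) {
        offspring  <- reproduce(population, fitness)          # selection, crossover, mutation
        population <- replace_population(population, offspring)
        fitness    <- sapply(population, evaluate_fitness)
        if (max(fitness) > best) {                            # track improvement of the best chromosome
          best  <- max(fitness)
          stall <- 0
        } else {
          stall <- stall + 1
        }
        if (stall >= max_stall) break                         # stop when improvement has slowed
      }
      population[[which.max(fitness)]]
    }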
Initialization process

The initial step involves generating individuals and forming the initial population. Generally, the chromosomes of these individuals are randomly decided, although the specific procedure depends on the chromosome format. It is therefore important to decide the encoding rule, that is, how information about the solution of the target problem is encoded into the chromosome. Sometimes the solutions obtained from other optimization algorithms are used as the initial population of the GA; this method is called seeding (Davis, 1991). The size of the population is also important to consider. Population size corresponds to the variety of chromosomes kept in the population and is directly linked to the accuracy of searching. However, it is also directly linked to the calculation amount (Youssef et al., 1995), so we must choose a population size appropriate to the target problem.

Reproduction process

The reproduction process constitutes the core of the optimization conducted by the GA. This process should be designed to evoke a survival-of-the-fittest mechanism, which means that an individual with a better chromosome is more likely to create a larger number of offspring. The process is thus expected to generate better solutions. To carry out this process, we must first determine the degree of goodness of each chromosome. However, this depends entirely on the target problem and there is no general rule for making this determination. It is also important how individuals are selected to mate. Of the many selection methods proposed, almost all are based on the goodness of the chromosome and stochastically favor better individuals. After selecting individuals to breed, the next step is to generate new chromosomes from the selected parents' chromosomes. This is equivalent to crossover in the natural world. As with selection, many crossover methods have been proposed, but they commonly split the parents' chromosomes into small parts and recombine them to create the offspring's chromosomes. The final step of reproduction is mutation. Mutation changes chromosomes randomly with a certain probability. Usually the mutation rate is not very high, but mutation is an essential process for the GA (Sait & Youssef, 2000). The choice of mutation rate therefore affects the accuracy and speed of the GA.

Replacement process

After reproduction, we must replace the old population with the newborn offspring. This step is called replacement. The major question associated with replacement is how many individuals should be replaced by offspring at one time. This replacement rate is called the generation gap, and which generation gap works best depends on the target problem. Another question is how to determine which individuals are replaced by offspring. There are two main types of replacement method: deciding randomly, and deciding based on the goodness of the chromosome.

Repetition process

Executing the reproduction process and the replacement process once is called one generation, and this is the basic unit of iteration in the optimization cycle of the GA. The cycle is repeated until certain stopping criteria are fulfilled. Conceivable criteria are: (a) an individual with a sufficiently good chromosome is found; (b) the cycle has been repeated a certain number of times; and (c) the pace of chromosome improvement has slowed. This issue is not specific to the GA, but is relevant to all iterative optimization algorithms.

These are the basic procedures of the GA. The most remarkable feature of the GA is that it intentionally keeps individuals with relatively bad chromosomes and mixes them randomly into the reproduction process. This widens the search area beyond the neighborhood of the initial population, and mutation supports this as well. As a result, the GA can search a wider area than a local search method, which increases the possibility of finding a globally optimal solution independently of the initial population. However, because the simple GA described above is inclined to expand the search area, it is not suited to examining a narrow area carefully. The simple GA therefore suffers from the disadvantage of slow convergence.
To improve this, the local search method is nowadays generally combined with the GA (Sait & Youssef, 2000). These hybrid methods are sometimes called genetic local search (GLS) methods to distinguish them from the simple GA. The algorithm proposed in the present study is one kind of GLS.

Genetic algorithm for model specification search of a factor analysis model

In this section, we describe the details of the GA used in this study.

The purpose of this study is to propose an algorithm for model specification search of a factor analysis model. However, the algorithm described below is actually a search algorithm only for a simple structure factor analysis model, because we intend to address the problems explained in the Introduction. Researchers usually tend to use a model searching method when they have limited knowledge about their data. In such a situation, confining the search space to a simple structure may seem too restrictive and inadequate. Despite this, we employed the restriction because a sufficiently practical and good-fitting factor analysis model can be found by an existing automated model searching method that starts from the simple structure factor analysis model found by the GA. Of course, real data hardly ever have a completely simple structure, but in practice we often strive to obtain a simple structure because a simple structure model is easier to interpret. Thus, a procedure that first finds a good-fitting simple structure factor analysis model and then explores around that model is not far from the real data analysis situation and meets our purpose. Accepting this procedure, we confined the search space of our algorithm to simple structure factor analysis.

Coding

First, we have to determine the rules for encoding information about the model specification into the chromosome. If we assume that E[f] = 0, E[e] = 0, and E[fe′] = 0, then the covariance structure of the factor analysis model is represented as follows:

    Σ(θ) = A Σ_f A′ + Σ_e,   (1)

where f is the factor vector, A is the factor loading matrix, e is the error variable vector, θ is the parameter vector, Σ_f is the factor covariance matrix, and Σ_e is the error covariance matrix. Here, we introduce five setups as follows:

1. Limit the search space to simple structure first-order factor analysis models.
2. Fix the number of observed variables and factors before starting the search.
3. Fix each factor's variance at one to identify the confirmatory factor analysis model.
4. Always estimate all covariances among factors, because it is highly unlikely that there are no correlations among factors at all.
5. Do not estimate error covariances among observed variables.

With these assumptions, whether elements are fixed or free is determined for Σ_f and Σ_e in Equation 1. From setups 3 and 4, the diagonal elements of Σ_f are fixed at one and the other elements are freely estimated. From setup 5, the diagonal elements of Σ_e are free and the other elements are fixed at zero. Consequently, we only have to think about the elements of the matrix A. A is the factor loading matrix, so whether a given parameter is estimated or not corresponds to whether the observed variable measures the corresponding factor or not. From setup 1, we only have to consider simple structures, meaning that every observed variable measures exactly one factor. Thus, we can use the information about which factor each observed variable measures as the values of the chromosome. For example, if A is

    A = [ λ11  0    0
          λ21  0    0
          0    λ32  0        (2)
          0    λ42  0
          0    0    λ53 ],

then the corresponding chromosome is derived as follows:

    [1 1 2 2 3].   (3)

This is the coding rule employed in this study. By limiting the search target to a simple structure model, the search area is drastically reduced compared with searching over general factor analysis models, so we can expect to lessen the calculation amount. This also deals with the problem of model complexity, because all candidate models come to have the same degrees of freedom. However, problems remain with regard to equivalent models and the rotational indefiniteness of factors. To deal with these problems, we introduced a factor reordering operator that renumbers the factors in ascending order of the smallest-numbered observed variable measuring each factor. For example,

    A = [ 0    0    λ13
          0    0    λ23
          0    λ32  0
          0    λ42  0
          λ51  0    0 ]

will be reordered into the same form as Equation 2; thus the corresponding chromosome is the same as Equation 3. In this study, this reordering operator was applied whenever the algorithm generated a new chromosome.
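To make the coding rule and the reordering operator concrete, the following is a minimal R sketch. The function names are ours, and the example call reproduces the chromosome of Equation 3 from the unordered loading pattern shown above.

    # A chromosome is an integer vector; its i-th gene gives the factor
    # measured by the i-th observed variable (simple structure).

    # Build the loading pattern implied by a chromosome: entry (i, k) is
    # TRUE when lambda_ik is a free parameter, FALSE when it is fixed at 0.
    chromosome_to_pattern <- function(chromosome, n_factors) {
      p <- length(chromosome)
      pattern <- matrix(FALSE, nrow = p, ncol = n_factors)
      pattern[cbind(seq_len(p), chromosome)] <- TRUE
      pattern
    }

    # Renumber factors in ascending order of the smallest observed-variable
    # number measuring each factor.
    reorder_chromosome <- function(chromosome) {
      first_var <- tapply(seq_along(chromosome), chromosome, min)
      new_label <- rank(first_var)
      as.integer(new_label[as.character(chromosome)])
    }

    reorder_chromosome(c(3, 3, 2, 2, 1))            # returns 1 1 2 2 3, the chromosome of Equation 3
    chromosome_to_pattern(c(1, 1, 2, 2, 3), 3)      # free-parameter pattern of Equation 2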

Initialization process

According to the coding rule, the chromosome is a string of integers ranging up to the number of factors. Chromosomes of the individuals in the initial population can therefore be generated from random draws from a multinomial distribution. However, we must be aware of the possibility of generating individuals whose chromosome represents an inestimable model specification; this can happen not only in the initialization process but also in the reproduction process. In this study, such individuals were removed from the population immediately, and initialization or reproduction was re-executed as necessary. Individuals with fewer factors than the number assumed at the start of the search were also removed.

Reproduction process

To carry out this process, we must in some way determine each individual's degree of goodness, and fortunately there are many fit indices in SEM that can be used. In this study, we chose the root mean square error of approximation (RMSEA; Browne & Cudeck, 1993; Steiger & Lind, 1980) as the index denoting the goodness of a chromosome. However, because the RMSEA value becomes smaller as model fit improves and the optimum value of RMSEA is zero, its magnitude represents the badness of a chromosome rather than its goodness. Hence, we actually used the inverse of the RMSEA, and when RMSEA = 0 we assigned a sufficiently large constant as the goodness value.

As the selection method, we employed the roulette wheel selection algorithm. This algorithm first constructs a virtual roulette wheel that represents the proportional fitness of all population members, such that good chromosomes cover a wide area on the wheel and bad chromosomes cover a narrow area. The wheel is then spun and two individuals are selected to be parents.

As the crossover method, we employed the uniform crossover algorithm. This algorithm produces offspring from the parents' chromosomes based on a randomly generated binary mask. The mask has the same length as the parents' chromosomes, and the value of each of its genes is randomly drawn from a Bernoulli distribution with a success rate of 0.50. Child 1 inherits genes from parent 1 where the mask value is 0 and from parent 2 where the mask value is 1; conversely, child 2 inherits genes from parent 1 where the mask value is 1 and from parent 2 where the mask value is 0. The mask is randomly determined each time the crossover operator is executed; hence the combination pattern of the parents' chromosomes varies at every crossover.
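The reproduction operators described above can be sketched in R as follows. The function fit_rmsea(), which stands for fitting the confirmatory factor analysis model implied by a chromosome and returning its RMSEA (done with the sem package in our implementation), is a hypothetical placeholder, and the constant 1e10 is an arbitrary stand-in for the goodness assigned when RMSEA = 0.

    # Goodness of a chromosome: inverse RMSEA.
    goodness <- function(chromosome) {
      rmsea <- fit_rmsea(chromosome)          # placeholder: fit the implied CFA and return its RMSEA
      if (rmsea == 0) 1e10 else 1 / rmsea     # arbitrary large constant for a perfect fit
    }

    # Roulette wheel selection: two parents drawn with probability
    # proportional to their goodness.
    select_parents <- function(population, fitness) {
      idx <- sample(seq_along(population), size = 2, prob = fitness / sum(fitness))
      population[idx]
    }

    # Uniform crossover with a Bernoulli(0.50) mask.
    uniform_crossover <- function(parent1, parent2) {
      mask   <- rbinom(length(parent1), size = 1, prob = 0.5)
      child1 <- ifelse(mask == 0, parent1, parent2)
      child2 <- ifelse(mask == 0, parent2, parent1)
      list(child1, child2)
    }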

Finally, each gene of the children's chromosomes is checked individually against a certain mutation rate and, if mutation occurs, the value of that gene is changed. For example, if the number of factors is three and the present value of a gene is two, the value changes to either one or three.

Replacement process

In this study, we basically decided that individuals would be replaced stochastically based on the goodness of their chromosomes. Specifically, we first generated as many offspring as the original population size, and then randomly selected individuals to survive into the next generation from among the existing individuals in the population and the newborn offspring, in proportion to the goodness of their chromosomes. The probabilistic sampling was without replacement, so each individual remaining in the next population had a different chromosome and there were no overlaps. Moreover, if all individuals were chosen to survive stochastically, a good chromosome in a certain generation might fail to pass its information on to the succeeding generation, causing inefficient optimization. To avoid this, we employed elitism (De Jong, 1975). Elitism is a strategy of keeping some individuals with good chromosomes and allowing them to be included in the next generation preferentially; this strategy is said to improve the performance of optimization. We kept only the one individual with the best chromosome as an elite in every generation.

Repetition process

In a practical situation, we do not know the optimum value of RMSEA for the target data. We therefore stopped estimation when the pace of improvement in the goodness of chromosomes slowed. Specifically, if the RMSEA of the best-fitting individual in the population did not improve for three consecutive generations, we terminated the search and adopted the best chromosome as the result. In addition, when a chromosome with an RMSEA of zero was found, we stopped searching.
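A sketch of the mutation and replacement operators, again in R and reusing the goodness() function from the previous sketch; the concrete constants are illustrative.

    # Mutation: each gene is changed, with probability mutation_rate, to one
    # of the other factor numbers chosen at random.
    mutate <- function(chromosome, n_factors, mutation_rate) {
      for (i in seq_along(chromosome)) {
        if (runif(1) < mutation_rate) {
          other <- setdiff(seq_len(n_factors), chromosome[i])
          chromosome[i] <- other[sample.int(length(other), 1)]
        }
      }
      chromosome
    }

    # Replacement with elitism: the single best individual always survives;
    # the remaining slots are filled by sampling without replacement from the
    # merged pool, with probability proportional to goodness.
    replace_population <- function(population, offspring, pop_size) {
      pool       <- c(population, offspring)
      fitness    <- sapply(pool, goodness)
      elite      <- which.max(fitness)
      candidates <- setdiff(seq_along(pool), elite)
      rest <- sample(candidates, size = pop_size - 1,
                     prob = fitness[candidates] / sum(fitness[candidates]))
      pool[c(elite, rest)]
    }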
Local searching method

To make convergence faster, we also adopted a local search method. The implemented procedure searches the one-flip neighborhood of the existing model, that is, the models obtained by changing only one gene of the base model. This method is indicated in Marcoulides and Drezner (2001), but the difference in the present study is that the search is executed in order from the first gene and, as soon as a better-fitting model than the base model is found, the search stops and that model is adopted as the result of the local search. For example, if the number of factors is three, we first examine the two models obtained by changing the first gene of the base chromosome to each of the other two factor numbers. If either of these has a better RMSEA than the base model, that model is adopted and the search finishes. If the RMSEA is not improved, we then change the second gene and examine the two corresponding models, and so on. Because this method does not search the one-flip neighborhood completely, the best-fitting model could be overlooked. However, examining all possible models in the one-flip neighborhood and choosing the best one, as in Marcoulides and Drezner (2001), takes much time, and the load increases with the number of observed variables and factors. Considering application to real data analysis, we therefore employed this simpler but faster method.
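The first-improvement local search can be sketched as follows, assuming the goodness() function defined earlier.

    # First-improvement search over the one-flip neighbourhood. Genes are
    # examined in order; the first neighbour that improves on the base
    # chromosome is accepted and the search stops.
    local_search <- function(chromosome, n_factors) {
      base_fit <- goodness(chromosome)
      for (i in seq_along(chromosome)) {
        for (k in setdiff(seq_len(n_factors), chromosome[i])) {
          candidate    <- chromosome
          candidate[i] <- k
          if (goodness(candidate) > base_fit) {
            return(candidate)            # first improving neighbour found
          }
        }
      }
      chromosome                         # no improving neighbour in the (partial) neighbourhood
    }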

Simulation study

Method

We conducted a simulation study to verify our algorithm. The simulation generated data from a known true structure, and we examined whether the GA could find the true model from those data. As true structures, we prepared three different setups: (a) a simple structure factor analysis model with three correlated factors and 12 observed variables, in which each factor is measured by four observed variables; (b) the first model with one additional factor loading, so that one factor is measured by five observed variables; and (c) the first model with each factor measured by one additional observed variable, thus adding three factor loading paths.

At each replication in the simulation study, we first generated a true covariance matrix from a randomly determined true structure following the above setups. Which variables measured which factor was decided randomly, and all parameters were drawn from continuous uniform distributions over prespecified ranges for (a) the factor loadings, (b) the factor correlations, and (c) the residual variances. We then generated 1000 samples from a multivariate normal distribution whose covariance matrix equaled the determined true covariance matrix, and used these samples as the data for that replication. When the true structure was a simple structure, we judged the success or failure of the GA search according to whether the true model and the found model had an identical factor pattern. When the true structure was not a simple structure, we judged the search to be successful if the estimated factor pattern was included in the true factor pattern. For example, if observed variable 1 measures factors 1 and 3 in the true pattern, an estimated pattern in which variable 1 measures factor 1 or factor 3 is acceptable, but one in which it measures factor 2 is regarded as a failure. We considered not only the success or failure of the search, but also the elapsed generations and central processing unit (CPU) time.

We also ran the model specification search under several different settings of the GA parameters and compared the results. In particular, we varied the population size and the mutation rate: the population size was set to 5, 10, or 20, and the mutation rate to 0.01, 0.05, or 0.1. All other parts of the algorithm remained as described in the previous section. Chromosomes of the individuals in the initial population were generated from a trinomial distribution in which each outcome is equally probable, and the mask for crossover was generated from a binomial distribution with a success rate of 0.50 and 12 trials. We thus used nine patterns of GA parameter settings and three types of true structure for data generation, giving a total of 27 conditions. We repeated the data generation and the model specification search 50 times for each condition. All calculations were executed in R for Windows (R Development Core Team, 2006), and the sem package (Fox, 2006) was used to compute the RMSEA.
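The data-generation step for one replication under the simple structure condition can be sketched as follows; the ranges of the uniform distributions shown here are illustrative placeholders rather than the exact values used in the study.

    # One replication under the simple-structure condition
    # (3 factors, 12 observed variables, 4 per factor, 1000 samples).
    library(MASS)   # for mvrnorm()

    n_factors <- 3; p <- 12; n_obs <- 1000

    true_chrom <- sample(rep(seq_len(n_factors), each = p / n_factors))   # random variable-to-factor assignment
    A <- matrix(0, p, n_factors)
    A[cbind(seq_len(p), true_chrom)] <- runif(p, 0.5, 0.9)                # factor loadings (illustrative range)
    Phi <- matrix(runif(n_factors^2, 0.1, 0.4), n_factors, n_factors)     # factor correlations (illustrative range)
    Phi <- (Phi + t(Phi)) / 2
    diag(Phi) <- 1
    Psi <- diag(runif(p, 0.3, 0.6))                                       # residual variances (illustrative range)

    Sigma_true <- A %*% Phi %*% t(A) + Psi                                # Equation 1
    dat <- mvrnorm(n = n_obs, mu = rep(0, p), Sigma = Sigma_true)         # data for this replication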
Results

Table 1 shows the success rate of the model specification search. When the true data structure approaches a simple structure, the exploration algorithm works well, and the success rate increases as the population size increases. When the true structure is not a simple structure, the success rate becomes comparatively lower, but it still remains acceptably high. It is therefore probable that our proposed algorithm can find the best-fitting model. Figure 1 shows one example of the search process when the true structure was a simple structure and the population size was 20. A good-fitting model suddenly appears during the search, and the fitness of the other individuals improves in later generations owing to that best model.

Table 1. Success rate for finding the true model, by population size (5, 10, 20) and mutation rate (0.01, 0.05, 0.1), for each of the three true structures.

Table 2. Mean and SD of genetic cycles (generations) spent finding the true structure, by population size (5, 10, 20) and mutation rate (0.01, 0.05, 0.1), for each of the three true structures. Replications that failed to find the true structure are not included; SDs are given in parentheses.

Figure 1. One example of the root mean square error of approximation (RMSEA) change history during the genetic algorithm search process.

Table 2 shows the means and standard deviations (SD) of the number of generations the algorithm needed to find the true model. From this table, we can say that with a large population size the number of elapsed generations decreases. However, this does not mean that a large population size improves the search speed. Table 3 shows the means and SD of the CPU time needed to find the true model; this table contains results only for the condition in which the true structure was a simple structure, and the Athlon computer we used had 2 GB of memory. Table 3 shows that the calculation time increases with population size. This is because most of the computational effort in this algorithm is spent computing the RMSEA. Thus, searching is faster with a larger number of generations and a smaller population.

Given these results, we can say that with a large population size the best-fitting model is highly likely to be found within a small number of generations, but this does not necessarily translate into faster searching in practice. Metaheuristics cannot be relied upon to find a globally optimal solution, and it is recommended to repeat the search more than once. A search strategy that balances speed and accuracy over repeated runs is therefore desirable. A final point of note is that the effect of the mutation rate is complicated and hard to interpret.

Table 3. Mean and SD of central processing unit (CPU) time (in seconds) spent finding the true structure when the true structure is a simple structure, by population size (5, 10, 20) and mutation rate (0.01, 0.05, 0.1). Replications that failed to find the true structure are not included; SDs are given in parentheses.

Example analysis

Method

Here, we analyzed real data using the GA and examined its performance. As nobody can know the real structure of natural data, we compared the result from the GA with the result from exploratory factor analysis (EFA). We used Holzinger and Swineford's (1939) test results as data. The data comprise the results of a cognitive achievement test. In this study, we used the results from the Grant-White Elementary School sample, with 145 examinees. The test consisted of 26 items. However, because items 3 and 25, and items 4 and 26, were designed to measure the same abilities and differ only in degree of difficulty, we used the easier items (items 25 and 26) and dropped the harder ones (items 3 and 4), using data for a total of 24 items in the analysis. These items were chosen from five different areas: (a) spatial (items 1, 2, 25, 26); (b) verbal (items 5-9); (c) speed (items 10-13); (d) memory (items 14-19); and (e) mathematical ability (items 20-24). After analyzing the collected data, Holzinger and Swineford (1939) concluded that a four-factor structure fits these data best. The data have been reanalyzed many times in the context of factor analysis (Carroll, 1993), and researchers have employed different numbers of factors, but in the present study we adopted the four-factor structure so as to follow the original study.

The GA was employed in the same manner as described in the previous section: chromosomes of the individuals in the initial population were generated from a quadrinomial distribution in which each outcome is equally probable, and the mask for crossover was generated from a binomial distribution with a success rate of 0.50 and 24 trials. The population size was set to 10, and a fixed mutation rate was used. With these parameters, we ran the GA 20 times and obtained 20 models. These parameter values were decided arbitrarily and empirically, based on a preliminary analysis.

At the same time, as a target for comparison, we carried out EFA on the same data. First, we estimated the factor loadings using maximum likelihood estimation and rotated them using three different rotation methods: Promax, Direct Oblimin, and Harris-Kaiser. Then, based on the results, we regarded each observed variable as measuring the factor on which it has the highest factor loading and thereby derived a simple structure from each EFA result. This is the standard procedure for deriving a simple structure in exploratory data analysis.
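The derivation of a simple structure from an EFA solution can be sketched as follows, using Promax rotation for illustration; test_scores stands for the 145 x 24 data matrix of the Grant-White sample.

    # Assign each item to the factor on which it has the largest absolute
    # loading, yielding a factor pattern in the same coding as the GA chromosome.
    efa <- factanal(test_scores, factors = 4, rotation = "promax")
    L   <- unclass(loadings(efa))
    efa_chromosome <- apply(abs(L), 1, which.max)
    efa_chromosome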

Result

Table 4 contains the fit indices of the models found using EFA and the GA. The model found most frequently by the GA had the best RMSEA, and it was also the model found using Promax rotation. Thus, for these data, both the GA and EFA could find the best-fitting model. Because of the arbitrariness of parameter setting in the GA, it is possible to obtain different search results. There are thus advantages and disadvantages to using a GA although, at the very least, the fact that a GA can perform as well as EFA has been confirmed.

Table 4. Fit index and factor pattern of the found models (RMSEA and factor pattern for each model found by the genetic algorithm and by EFA with Promax, Harris-Kaiser, and Direct Oblimin rotation). a) Marks are listed in ascending order of item number, and the i-th mark represents the factor that the i-th item measures, so items with the same mark form one factor; note the absence of items 3 and 4. b) The number indicates how many times out of 20 repetitions the proposed algorithm found that factor pattern. c) A named rotation method indicates that the corresponding factor pattern was obtained by EFA using that rotation method. A dash indicates a model for which a corresponding factor pattern was not found; RMSEA, root mean square error of approximation.

Moreover, the model found using the GA has a somewhat different structure from the three EFA models. The models chosen using EFA are much alike, in that the items for mathematical ability are divided between the spatial and speed factors. In contrast, in the GA model the items for memory are divided among other categories. From this viewpoint, it can be said that a model specification search using a GA can find a structure different from that found by a search using EFA, yet with almost identical fit indices.

Discussion

Through a simulation study and a real data analysis, the performance of model specification search using a GA was demonstrated. The algorithm we propose here could find the true model from artificial data with a certain probability and within a reasonable time, and it could also find a good-fitting model when analyzing real data. We can therefore say that model specification search using a GA for a simple structure factor analysis model has sufficient applicability. In particular, it was confirmed that our proposed algorithm can find a wide variety of models that are difficult to find using EFA. This feature will contribute to model specification search when researchers do not have an explicit hypothesis and want to perform data-driven model construction.

Because the GA is a metaheuristic search method, it cannot always find the best model; this is obvious from the simulation study results. Consequently, it is highly recommended that researchers repeat the model specification search multiple times and choose their own best model, considering not only the fit indices but also interpretability and consistency with the theory of their field. Care should also be taken with regard to the parameters of the GA. Population size in particular is related to search speed and accuracy. Researchers must set a sufficiently large population size for their data, but too large a population size may slow the search.

They must therefore determine an appropriate population size, although there is no simple rule for doing so. Perhaps studies that have attempted to adjust parameters automatically within the estimation process itself may be helpful in tackling this issue (e.g., Beasley, 1993; Nonobe & Ibaraki, 1998).

We recognize the need to improve the speed and accuracy of our algorithm, and these problems can be approached mainly from two perspectives. One is to apply the latest advances in metaheuristics. Metaheuristics is an ongoing research area, and many kinds of algorithms are currently being proposed; these will help improve our algorithm. The other approach is to rewrite and optimize the program code. In this study, we used the R system. R is highly flexible and easy to use, but it is a scripting language, and its processing speed is therefore not high; rewriting the code in a compiled language would certainly make the search faster.

A final consideration is that this study handled model specification search only for a simple structure factor analysis model. Nevertheless, employing a GA for model specification search of a simple structure factor model was found to be helpful. The application of metaheuristics to model specification search in general SEM thus appears to be effective, powerful, and easy to apply. This topic merits further research.

References

Akaike, H. (1987). Factor analysis and AIC. Psychometrika, 52.
Bäck, T., Fogel, D. B., & Michalewicz, Z. (Eds.). (1997). Handbook of evolutionary computation. New York: IOP Publishing and Oxford University Press.
Beasley, J. E. (1993). Lagrangian heuristics for location problems. European Journal of Operational Research, 65.
Bentler, P. M. (1995). EQS structural equation program manual. Encino, CA: Multivariate Software Inc.
Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. Bollen & J. S. Long (Eds.), Testing structural equation models. Newbury Park, CA: Sage Publications.
Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. Cambridge: Cambridge University Press.
Cerny, V. (1985). Thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm. Journal of Optimization Theory and Applications, 45.
Chambers, L. (Ed.). (1995). Practical handbook of genetic algorithms: Applications (Vol. I). Boca Raton, FL: CRC Press.
Davis, L. (1991). Handbook of genetic algorithms. New York: Van Nostrand Reinhold.
De Jong, K. A. (1975). An analysis of the behavior of a class of genetic adaptive systems (Doctoral dissertation, University of Michigan, 1975). Dissertation Abstracts International, 36(10), 5140B.
Fox, J. (2006). sem: Structural equation models (R package, computer software). [Downloaded 3 June 2006.]
Gabbert, P., Brown, D., Huntley, C., Markowicz, B., & Sappingston, D. (1991). A system for learning routes and schedules with genetic algorithms. In Proceedings of ICGA-91. Morgan Kaufmann Publishers.
Glover, F. (1989). Tabu search: Part I. ORSA Journal on Computing, 1.
Glover, F. (1990). Tabu search: Part II. ORSA Journal on Computing, 2.
Green, P. J. (1995). Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82.
Harwood, S., & Scheines, R. (2002). Genetic algorithm search over causal models (Tech. Rep. CMU-PHIL-131). Pittsburgh, PA: Carnegie Mellon University, Department of Philosophy.
Henderson, D. A., & Denison, D. R. (1989). Stepwise regression in social and psychological research. Psychological Reports, 64.
Holland, J. H. (1975). Adaptation in natural and artificial systems: An introductory analysis with applications to biology, control, and artificial intelligence. Ann Arbor, MI: University of Michigan Press.
Holzinger, K. J., & Swineford, F. (1939). A study in factor analysis: The stability of a bi-factor solution (Supplementary Educational Monographs 48). Chicago, IL: The University of Chicago.
Jöreskog, K. G., & Sörbom, D. (1996). LISREL 8: User's reference guide. Chicago, IL: Scientific Software.
Kling, R. M., & Banerjee, P. (1991). Empirical and theoretical studies of the simulated evolution method applied to standard cell placement. IEEE Transactions on Computer-Aided Design, 10.
Long, J. S. (1983). Covariance structure models: An introduction to LISREL. Beverly Hills, CA: Sage.

Man, K. F., Tang, K. S., & Kwong, S. K. (1999). Genetic algorithms: Concepts and designs. London: Springer-Verlag.
Marcoulides, G. A., & Drezner, Z. (2001). Specification searches in structural equation modeling with a genetic algorithm. In G. A. Marcoulides & R. E. Schumacker (Eds.), New developments and techniques in structural equation modeling. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Marcoulides, G. A., Drezner, Z., & Schumacker, R. E. (1998). Model specification searches in structural equation modeling using tabu search. Structural Equation Modeling, 5.
Mills, J. D., Olejnik, S. F., & Marcoulides, G. A. (2005). The tabu search procedure: An alternative to the variable selection methods. Multivariate Behavioral Research, 3.
Nonobe, K., & Ibaraki, T. (1998). A tabu search approach to the constraint satisfaction problem as a general problem solver. European Journal of Operational Research, 106.
R Development Core Team (2006). R: A language and environment for statistical computing (Computer software). Vienna, Austria: R Foundation for Statistical Computing.
Saab, Y. G., & Rao, V. B. (1991). Combinatorial optimization by stochastic evolution. IEEE Transactions on Computer-Aided Design, 10.
Sait, S. M., & Youssef, H. (2000). Iterative computer algorithms with applications in engineering: Solving combinatorial optimization problems. New York: Wiley-IEEE Computer Society Press.
Sörbom, D. (1989). Model modification. Psychometrika, 54.
Spirtes, P., Glymour, C., & Scheines, R. (1993). Causation, prediction and search. New York: Springer-Verlag.
Steiger, J. H., & Lind, J. C. (1980). Statistically based tests for the number of common factors. Paper presented at the Annual Meeting of the Psychometric Society, Iowa City, IA.
Syswerda, G., & Palmucci, J. (1991). The application of genetic algorithms to resource scheduling. In Proceedings of ICGA-91. San Francisco, CA: Morgan Kaufmann Publishers.
Youssef, H., Sadiq, M., Nassar, K., & Benten, M. S. T. (1995). Performance driven standard-cell placement using the genetic algorithm. Paper presented at the Fifth Great Lakes Symposium on VLSI, Buffalo, NY.

(Received May 13, 2005; accepted March 3, 2007)


More information

Evolutionary Computation Algorithms for Cryptanalysis: A Study

Evolutionary Computation Algorithms for Cryptanalysis: A Study Evolutionary Computation Algorithms for Cryptanalysis: A Study Poonam Garg Information Technology and Management Dept. Institute of Management Technology Ghaziabad, India pgarg@imt.edu Abstract The cryptanalysis

More information

A Genetic Algorithm for Multiprocessor Task Scheduling

A Genetic Algorithm for Multiprocessor Task Scheduling A Genetic Algorithm for Multiprocessor Task Scheduling Tashniba Kaiser, Olawale Jegede, Ken Ferens, Douglas Buchanan Dept. of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB,

More information

MINIMAL EDGE-ORDERED SPANNING TREES USING A SELF-ADAPTING GENETIC ALGORITHM WITH MULTIPLE GENOMIC REPRESENTATIONS

MINIMAL EDGE-ORDERED SPANNING TREES USING A SELF-ADAPTING GENETIC ALGORITHM WITH MULTIPLE GENOMIC REPRESENTATIONS Proceedings of Student/Faculty Research Day, CSIS, Pace University, May 5 th, 2006 MINIMAL EDGE-ORDERED SPANNING TREES USING A SELF-ADAPTING GENETIC ALGORITHM WITH MULTIPLE GENOMIC REPRESENTATIONS Richard

More information

The study of comparisons of three crossover operators in genetic algorithm for solving single machine scheduling problem. Quan OuYang, Hongyun XU a*

The study of comparisons of three crossover operators in genetic algorithm for solving single machine scheduling problem. Quan OuYang, Hongyun XU a* International Conference on Manufacturing Science and Engineering (ICMSE 2015) The study of comparisons of three crossover operators in genetic algorithm for solving single machine scheduling problem Quan

More information

STATISTICS (STAT) Statistics (STAT) 1

STATISTICS (STAT) Statistics (STAT) 1 Statistics (STAT) 1 STATISTICS (STAT) STAT 2013 Elementary Statistics (A) Prerequisites: MATH 1483 or MATH 1513, each with a grade of "C" or better; or an acceptable placement score (see placement.okstate.edu).

More information

SUITABLE CONFIGURATION OF EVOLUTIONARY ALGORITHM AS BASIS FOR EFFICIENT PROCESS PLANNING TOOL

SUITABLE CONFIGURATION OF EVOLUTIONARY ALGORITHM AS BASIS FOR EFFICIENT PROCESS PLANNING TOOL DAAAM INTERNATIONAL SCIENTIFIC BOOK 2015 pp. 135-142 Chapter 12 SUITABLE CONFIGURATION OF EVOLUTIONARY ALGORITHM AS BASIS FOR EFFICIENT PROCESS PLANNING TOOL JANKOWSKI, T. Abstract: The paper presents

More information

A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2

A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2 Chapter 5 A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2 Graph Matching has attracted the exploration of applying new computing paradigms because of the large number of applications

More information

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function Introduction METAHEURISTICS Some problems are so complicated that are not possible to solve for an optimal solution. In these problems, it is still important to find a good feasible solution close to the

More information

A Genetic Algorithm for the Multiple Knapsack Problem in Dynamic Environment

A Genetic Algorithm for the Multiple Knapsack Problem in Dynamic Environment , 23-25 October, 2013, San Francisco, USA A Genetic Algorithm for the Multiple Knapsack Problem in Dynamic Environment Ali Nadi Ünal Abstract The 0/1 Multiple Knapsack Problem is an important class of

More information

Four Methods for Maintenance Scheduling

Four Methods for Maintenance Scheduling Four Methods for Maintenance Scheduling Edmund K. Burke, University of Nottingham, ekb@cs.nott.ac.uk John A. Clark, University of York, jac@minster.york.ac.uk Alistair J. Smith, University of Nottingham,

More information

Genetic Algorithms and Image Search Pavel Mrázek

Genetic Algorithms and Image Search Pavel Mrázek Genetic Algorithms and Image Search Pavel Mrázek Department of Computer Science, Faculty of Electrical Engineering, Czech Technical University (»VUT), Karlovo nám. 13, 12135 Praha 2, Czech Republic e-mail:

More information

Fast Efficient Clustering Algorithm for Balanced Data

Fast Efficient Clustering Algorithm for Balanced Data Vol. 5, No. 6, 214 Fast Efficient Clustering Algorithm for Balanced Data Adel A. Sewisy Faculty of Computer and Information, Assiut University M. H. Marghny Faculty of Computer and Information, Assiut

More information

Real Coded Genetic Algorithm Particle Filter for Improved Performance

Real Coded Genetic Algorithm Particle Filter for Improved Performance Proceedings of 2012 4th International Conference on Machine Learning and Computing IPCSIT vol. 25 (2012) (2012) IACSIT Press, Singapore Real Coded Genetic Algorithm Particle Filter for Improved Performance

More information

Evolutionary Computation Part 2

Evolutionary Computation Part 2 Evolutionary Computation Part 2 CS454, Autumn 2017 Shin Yoo (with some slides borrowed from Seongmin Lee @ COINSE) Crossover Operators Offsprings inherit genes from their parents, but not in identical

More information

ARTIFICIAL INTELLIGENCE (CSCU9YE ) LECTURE 5: EVOLUTIONARY ALGORITHMS

ARTIFICIAL INTELLIGENCE (CSCU9YE ) LECTURE 5: EVOLUTIONARY ALGORITHMS ARTIFICIAL INTELLIGENCE (CSCU9YE ) LECTURE 5: EVOLUTIONARY ALGORITHMS Gabriela Ochoa http://www.cs.stir.ac.uk/~goc/ OUTLINE Optimisation problems Optimisation & search Two Examples The knapsack problem

More information

Genetic Algorithms Variations and Implementation Issues

Genetic Algorithms Variations and Implementation Issues Genetic Algorithms Variations and Implementation Issues CS 431 Advanced Topics in AI Classic Genetic Algorithms GAs as proposed by Holland had the following properties: Randomly generated population Binary

More information

Network Routing Protocol using Genetic Algorithms

Network Routing Protocol using Genetic Algorithms International Journal of Electrical & Computer Sciences IJECS-IJENS Vol:0 No:02 40 Network Routing Protocol using Genetic Algorithms Gihan Nagib and Wahied G. Ali Abstract This paper aims to develop a

More information

MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS

MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS In: Journal of Applied Statistical Science Volume 18, Number 3, pp. 1 7 ISSN: 1067-5817 c 2011 Nova Science Publishers, Inc. MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS Füsun Akman

More information

[Premalatha, 4(5): May, 2015] ISSN: (I2OR), Publication Impact Factor: (ISRA), Journal Impact Factor: 2.114

[Premalatha, 4(5): May, 2015] ISSN: (I2OR), Publication Impact Factor: (ISRA), Journal Impact Factor: 2.114 IJESRT INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY GENETIC ALGORITHM FOR OPTIMIZATION PROBLEMS C. Premalatha Assistant Professor, Department of Information Technology Sri Ramakrishna

More information

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding e Scientific World Journal, Article ID 746260, 8 pages http://dx.doi.org/10.1155/2014/746260 Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding Ming-Yi

More information

Genetic algorithms and finite element coupling for mechanical optimization

Genetic algorithms and finite element coupling for mechanical optimization Computer Aided Optimum Design in Engineering X 87 Genetic algorithms and finite element coupling for mechanical optimization G. Corriveau, R. Guilbault & A. Tahan Department of Mechanical Engineering,

More information

Using Genetic Algorithms in Integer Programming for Decision Support

Using Genetic Algorithms in Integer Programming for Decision Support Doi:10.5901/ajis.2014.v3n6p11 Abstract Using Genetic Algorithms in Integer Programming for Decision Support Dr. Youcef Souar Omar Mouffok Taher Moulay University Saida, Algeria Email:Syoucef12@yahoo.fr

More information

A Genetic Algorithm Approach to the Group Technology Problem

A Genetic Algorithm Approach to the Group Technology Problem IMECS 008, 9- March, 008, Hong Kong A Genetic Algorithm Approach to the Group Technology Problem Hatim H. Sharif, Khaled S. El-Kilany, and Mostafa A. Helaly Abstract In recent years, the process of cellular

More information

Distributed Probabilistic Model-Building Genetic Algorithm

Distributed Probabilistic Model-Building Genetic Algorithm Distributed Probabilistic Model-Building Genetic Algorithm Tomoyuki Hiroyasu 1, Mitsunori Miki 1, Masaki Sano 1, Hisashi Shimosaka 1, Shigeyoshi Tsutsui 2, and Jack Dongarra 3 1 Doshisha University, Kyoto,

More information

A New Selection Operator - CSM in Genetic Algorithms for Solving the TSP

A New Selection Operator - CSM in Genetic Algorithms for Solving the TSP A New Selection Operator - CSM in Genetic Algorithms for Solving the TSP Wael Raef Alkhayri Fahed Al duwairi High School Aljabereyah, Kuwait Suhail Sami Owais Applied Science Private University Amman,

More information

A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron

A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron Kiat-Choong Chen Ian Hsieh Cao An Wang Abstract A minimum tetrahedralization of a convex polyhedron is a partition of the convex

More information

Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest

Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest Bhakti V. Gavali 1, Prof. Vivekanand Reddy 2 1 Department of Computer Science and Engineering, Visvesvaraya Technological

More information

Algorithm Design (4) Metaheuristics

Algorithm Design (4) Metaheuristics Algorithm Design (4) Metaheuristics Takashi Chikayama School of Engineering The University of Tokyo Formalization of Constraint Optimization Minimize (or maximize) the objective function f(x 0,, x n )

More information

A HYBRID APPROACH IN GENETIC ALGORITHM: COEVOLUTION OF THREE VECTOR SOLUTION ENCODING. A CASE-STUDY

A HYBRID APPROACH IN GENETIC ALGORITHM: COEVOLUTION OF THREE VECTOR SOLUTION ENCODING. A CASE-STUDY A HYBRID APPROACH IN GENETIC ALGORITHM: COEVOLUTION OF THREE VECTOR SOLUTION ENCODING. A CASE-STUDY Dmitriy BORODIN, Victor GORELIK, Wim DE BRUYN and Bert VAN VRECKEM University College Ghent, Ghent, Belgium

More information

A LOCAL SEARCH GENETIC ALGORITHM FOR THE JOB SHOP SCHEDULING PROBLEM

A LOCAL SEARCH GENETIC ALGORITHM FOR THE JOB SHOP SCHEDULING PROBLEM A LOCAL SEARCH GENETIC ALGORITHM FOR THE JOB SHOP SCHEDULING PROBLEM Kebabla Mebarek, Mouss Leila Hayat and Mouss Nadia Laboratoire d'automatique et productique, Université Hadj Lakhdar -Batna kebabla@yahoo.fr,

More information

Using Simple Ancestry to Deter Inbreeding for Persistent Genetic Algorithm Search

Using Simple Ancestry to Deter Inbreeding for Persistent Genetic Algorithm Search Using Simple Ancestry to Deter Inbreeding for Persistent Genetic Algorithm Search Aditya Wibowo and Peter Jamieson Dept. of Electrical and Computer Engineering Miami University Abstract In this work, we

More information

The Parallel Software Design Process. Parallel Software Design

The Parallel Software Design Process. Parallel Software Design Parallel Software Design The Parallel Software Design Process Deborah Stacey, Chair Dept. of Comp. & Info Sci., University of Guelph dastacey@uoguelph.ca Why Parallel? Why NOT Parallel? Why Talk about

More information

BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR FLEXIBLE JOB-SHOP SCHEDULING PROBLEM. Minimizing Make Span and the Total Workload of Machines

BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR FLEXIBLE JOB-SHOP SCHEDULING PROBLEM. Minimizing Make Span and the Total Workload of Machines International Journal of Mathematics and Computer Applications Research (IJMCAR) ISSN 2249-6955 Vol. 2 Issue 4 Dec - 2012 25-32 TJPRC Pvt. Ltd., BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR FLEXIBLE JOB-SHOP

More information

Genetic Algorithms. Kang Zheng Karl Schober

Genetic Algorithms. Kang Zheng Karl Schober Genetic Algorithms Kang Zheng Karl Schober Genetic algorithm What is Genetic algorithm? A genetic algorithm (or GA) is a search technique used in computing to find true or approximate solutions to optimization

More information

Artificial Intelligence Application (Genetic Algorithm)

Artificial Intelligence Application (Genetic Algorithm) Babylon University College of Information Technology Software Department Artificial Intelligence Application (Genetic Algorithm) By Dr. Asaad Sabah Hadi 2014-2015 EVOLUTIONARY ALGORITHM The main idea about

More information

A Development of Hybrid Cross Entropy-Tabu Search Algorithm for Travelling Repairman Problem

A Development of Hybrid Cross Entropy-Tabu Search Algorithm for Travelling Repairman Problem Proceedings of the 2012 International Conference on Industrial Engineering and Operations Management Istanbul, Turkey, July 3 6, 2012 A Development of Hybrid Cross Entropy-Tabu Search Algorithm for Travelling

More information

Abstract. 1 Introduction

Abstract. 1 Introduction A Robust Real-Coded Genetic Algorithm using Unimodal Normal Distribution Crossover Augmented by Uniform Crossover : Effects of Self-Adaptation of Crossover Probabilities Isao Ono Faculty of Engineering,

More information

Using Genetic Algorithms to optimize ACS-TSP

Using Genetic Algorithms to optimize ACS-TSP Using Genetic Algorithms to optimize ACS-TSP Marcin L. Pilat and Tony White School of Computer Science, Carleton University, 1125 Colonel By Drive, Ottawa, ON, K1S 5B6, Canada {mpilat,arpwhite}@scs.carleton.ca

More information

Introduction to Genetic Algorithms. Based on Chapter 10 of Marsland Chapter 9 of Mitchell

Introduction to Genetic Algorithms. Based on Chapter 10 of Marsland Chapter 9 of Mitchell Introduction to Genetic Algorithms Based on Chapter 10 of Marsland Chapter 9 of Mitchell Genetic Algorithms - History Pioneered by John Holland in the 1970s Became popular in the late 1980s Based on ideas

More information

Santa Fe Trail Problem Solution Using Grammatical Evolution

Santa Fe Trail Problem Solution Using Grammatical Evolution 2012 International Conference on Industrial and Intelligent Information (ICIII 2012) IPCSIT vol.31 (2012) (2012) IACSIT Press, Singapore Santa Fe Trail Problem Solution Using Grammatical Evolution Hideyuki

More information

Neural Network Weight Selection Using Genetic Algorithms

Neural Network Weight Selection Using Genetic Algorithms Neural Network Weight Selection Using Genetic Algorithms David Montana presented by: Carl Fink, Hongyi Chen, Jack Cheng, Xinglong Li, Bruce Lin, Chongjie Zhang April 12, 2005 1 Neural Networks Neural networks

More information

Evolving SQL Queries for Data Mining

Evolving SQL Queries for Data Mining Evolving SQL Queries for Data Mining Majid Salim and Xin Yao School of Computer Science, The University of Birmingham Edgbaston, Birmingham B15 2TT, UK {msc30mms,x.yao}@cs.bham.ac.uk Abstract. This paper

More information
