The Acyclic Bayesian Net Generator (Student Paper)


Pankaj B. Gupta and Vicki H. Allan

Microsoft Corporation, One Microsoft Way, Redmond, WA 98, USA
Computer Science Department, Utah State University, Logan, UT 8

Abstract. We present the Acyclic Bayesian Net Generator, a new approach to learning the structure of a Bayesian network using genetic algorithms. Because of the encoding mechanism, acyclicity is preserved through mutation and crossover. We give a detailed description of how our method works and explain why it improves on previous approaches. We can efficiently perform crossover on chromosomes with different node orders without the danger of cycle formation, so the approach is capable of learning over all variable node orderings and structures. We also present a proof that our technique of choosing the initial population semi-randomly ensures that the genetic algorithm searches over the whole solution space. Tests show that the method is effective.

Keywords: Structure Learning, Bayesian Networks, Genetic Algorithms.

1 Introduction

Many intelligent systems attempt to discover the relationships between data []. Often data is available for the parameters of a problem area, but no information is available about the relations between them. Bayesian networks are graphical models capable of representing relations between the parameters of a problem in a way that is easy for humans to interpret and visualize; they encode relations that are intuitive to humans and are also suitable for statistical analysis. The nodes in a Bayesian network represent the parameters, and the directed edges represent the cause-effect relationships between them. Learning the structure of a Bayesian network is a challenging problem. Genetic algorithms are evolutionary algorithms capable of solving problems with large solution spaces.
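This node-and-edge view of a Bayesian network's structure can be written down directly. The following is a hypothetical toy example (the names and representation are ours, not the paper's): nodes with directed cause-effect edges stored as a parent map.

```python
# A toy Bayesian network held as a parent map: each node lists its
# direct causes (parents).  Probability tables are omitted; only the
# structure (nodes and directed edges) is shown.
toy_network = {
    "Rain":      [],                      # no parents
    "Sprinkler": [],                      # no parents
    "WetGrass":  ["Rain", "Sprinkler"],   # caused by both
}

def parents(net, node):
    """Return the direct causes of `node`."""
    return net[node]

def edges(net):
    """Enumerate the directed cause-effect edges."""
    return [(p, child) for child, ps in net.items() for p in ps]
```

Structure learning is the problem of recovering such a parent map from data alone.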
To use genetic algorithms for a problem area, four components are required: an encoding scheme, a score function, a crossover function, and a mutation function. The encoding scheme is a way of representing a possible solution as a low-level data structure, such as an array of bits; an encoded solution is called a chromosome. A set of chromosomes which are contestants for producing the optimal solution is called a population. A crossover function mixes two chromosomes (called parent chromosomes) to generate two new chromosomes (called daughter chromosomes). A mutation function randomly changes a chromosome. A score function is a measure of the

quality of a solution (chromosome). Initially, a population of random chromosomes is generated. In each iteration of the genetic algorithm, a set of daughter chromosomes is created by crossing over pairs of parents chosen randomly from the population. The daughter chromosomes are added to the population, and the population is then reduced to its original size by discarding the worst-quality chromosomes. This process is repeated until a quality chromosome is generated. Learning the structure of a Bayesian network from data has a large solution space and is known to be NP-hard [2, 6]. Thus, genetic algorithms provide an effective approach to the problem. In this research, we present a new method that learns the structure of a Bayesian network using genetic algorithms. We name our method the Acyclic Bayesian Net Generator. We preserve acyclicity without restricting node order. The initial population has a significant role in the success of a genetic algorithm. We present a technique to generate the initial population in a semi-random manner, to ensure that the search is performed over the whole solution space.

2 Previous Work

Initial attempts at using genetic algorithms for learning the structure of Bayesian networks are credited to Larranaga et al. [8, 9]. They use a structure of booleans which they term a connectivity matrix {c_ij}, in which c_ij = 1 indicates that i is a parent of j. To preserve an acyclic Bayesian net, either an existing node ordering is assumed for the learning algorithm (and parents must be selected from nodes earlier in the order), or a repair operator is applied to delete cycle-causing edges from cyclic networks formed during the process. Myers et al. use genetic algorithms to learn the structure of Bayesian networks from incomplete data [11]. Adjacency lists are used to describe the parents of a node of the Bayesian network. Illegal Bayesian networks generated in the process are handled by assigning them a low score.
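The iterative loop described above (create daughters by crossover, add them to the population, discard the worst) can be sketched as follows. The function names and parameters here are our own illustration, not the paper's implementation; note that the best chromosome is never discarded, so quality is monotone over iterations.

```python
import random

def genetic_algorithm(initial_population, score, crossover, mutate,
                      generations=100):
    """Generic GA loop: each iteration produces daughters from randomly
    chosen parent pairs, adds them to the population, and truncates the
    population back to its original size by dropping the worst scorers."""
    population = list(initial_population)
    size = len(population)
    for _ in range(generations):
        daughters = []
        for _ in range(size // 2):
            a, b = random.sample(population, 2)   # pick two parents
            x, y = crossover(a, b)
            daughters.extend([mutate(x), mutate(y)])
        population.extend(daughters)
        population.sort(key=score, reverse=True)  # best first
        population = population[:size]            # discard the worst
    return population[0]                          # best chromosome found
```

As a usage sketch, plugging in a bit-string encoding with single-point crossover and bit-flip mutation turns this skeleton into a standard GA; the paper's contribution is the Bayesian-network-specific encoding, crossover, and mutation described in the following sections.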
Guo et al. use a genetic algorithm to tune the node ordering of a Bayesian network [4]. The node ordering serves as an input to the K2 algorithm [3]. There are several improvements in our approach, the Acyclic Bayesian Net Generator. Our encoding scheme is inherently acyclic, and thus we never have to deal with cyclic networks. We neither assume a pre-defined node ordering, nor delete arcs (and thus lose information) to maintain network acyclicity. Our crossover and mutation functions are closed operations. Our method has the advantage that we optimize the complete solution rather than optimizing pieces and attempting a combination, which is typically a less than optimal approach.

3 Definitions and Assumptions

For this research, n represents the number of nodes of a Bayesian network. The nodes are labeled from 1 to n. Chromosomes are represented with capital

letters such as A, B, C. Sets of chromosomes are represented by bold letters such as U, V. A chromosome C for a Bayesian network of n nodes is written C^n. We write C° for the node order associated with a chromosome C.

Fig. 1. A Bayesian Network, A Topological Node Order, And The Corresponding Encoded Chromosome

4 The Encoding Scheme

A chromosome defines both a Bayesian network and a topological order on the nodes of the Bayesian network. Associating a node order with a chromosome has two advantages: first, by crossing over chromosomes of different node orders, our algorithm learns the optimal node ordering; second, the acyclic nature is preserved over operations like crossover and mutation. Consider a Bayesian network of size n, and let C be the corresponding encoded chromosome. C is an array of n genes, C.Gene_1, C.Gene_2, ..., C.Gene_n, with one gene corresponding to each node in the Bayesian network. Figure 1 shows an encoding for a sample Bayesian network; each row of the encoded chromosome represents a gene. Each gene contains the following two pieces of information:

Node: The i-th gene of the chromosome C corresponds to the i-th node in the associated topological node order. The node information stored in the gene is written C_i, and the sequence C_1, C_2, ..., C_n is the node order C°. In Figure 1, the gene shown corresponds to node 6, so the corresponding C_i is 6.

Parents: The parent information of chromosome C is a connectivity matrix {C_i,j} with j < i and C_i,j ∈ {0, 1}. C_i,j = 1 if and only if C_j is a parent of

C_i. The i-th row of the connectivity matrix represents the parent information associated with the i-th gene. In Figure 1, nodes 8 and 7 are among the possible parents of node 6, and node 8 is the only actual parent of node 6.

Fig. 2. Parent Chromosomes

5 Crossover Function

Our O(n^2) crossover function ensures that the daughter chromosomes produced are acyclic. Let the two parent chromosomes be A and B, and the two daughter chromosomes be X and Y. The crossover function can be thought of as a two-step process: (1) producing node orders, and (2) assigning parents.

Step 1: Daughters' Node Orders. Figure 2 shows two parent chromosomes, the associated Bayesian networks, and their node orders. Our crossover point, cp, is determined randomly and divides the nodes into two pieces; the crossover point used for Figure 2 is marked in the figure. Based on the crossover point, A° is split into A1° and A2°: A1° has nodes A_1 ... A_cp and A2° has nodes A_cp+1 ... A_n. The node order for chromosome B is likewise divided into two parts B1° and B2°, such that B1° has the same nodes as A1° (but in B's relative order) and B2° has the same nodes as A2° (again in B's relative order). To generate the node orderings of the two daughter chromosomes, we concatenate A1° with B2° to get X°, and B1° with A2° to get Y°. Figure 3 shows this process for the parent chromosomes in Figure 2. This step enables our Acyclic Bayesian Net Generator to build from different node orders to learn the optimal one. Later in this paper, we present a proof that our learning algorithm is capable of iterating over every node order.
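The order-building step above can be sketched as follows, under an assumed list-of-nodes representation of node orders (the identifiers are ours):

```python
def order_crossover(a_order, b_order, cp):
    """Step 1 of the crossover: build the daughters' node orders.
    X takes the first cp nodes in A's order followed by the remaining
    nodes in B's relative order; Y takes the first cp nodes (A1's node
    set) in B's relative order followed by the remaining nodes in A's
    order."""
    a1, a2 = set(a_order[:cp]), set(a_order[cp:])
    b1 = [n for n in b_order if n in a1]   # A1's nodes, B's relative order
    b2 = [n for n in b_order if n in a2]   # A2's nodes, B's relative order
    x_order = a_order[:cp] + b2
    y_order = b1 + a_order[cp:]
    return x_order, y_order
```

Because each daughter order is built from two disjoint node sets, both daughters are guaranteed to be permutations of the full node set.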

Fig. 3. Generating New Topological Node Orders

Step 2: Parent Information. Once we have determined X° and Y°, we need to generate the parent information for the nodes in X and Y. This step can generate two different sets of parent information for the daughter chromosomes; which set is actually produced is decided randomly. We refer to the two methods as M1 and M2.

Let x1 and x2 be two nodes such that x1 precedes x2 in X°. Recall that X° is formed by concatenating A1° and B2°. In daughter chromosome X, x1 is a parent of x2 if and only if one of the following conditions is met.

1. Either both x1 and x2 belong to A1° and x1 is a parent of x2 in A, or both x1 and x2 belong to B2° and x1 is a parent of x2 in B (solid edges in Figure 4).
2. x1 belongs to A1° and x2 belongs to B2°, and
   Method M1: x1 is a parent of x2 in A (dotted edges in Figure 4).
   Method M2: either x1 is a parent of x2 in B (large dashed edges in Figure 4), or x2 is a parent of x1 in B (small dashed edges in Figure 4).

The steps for generating daughter chromosome Y differ slightly. Let y1 and y2 be two nodes such that y1 precedes y2 in Y°. In daughter chromosome Y, y1 is a parent of y2 if and only if one of the following conditions is met.

1. Either both y1 and y2 belong to B1° and y1 is a parent of y2 in B, or both y1 and y2 belong to A2° and y1 is a parent of y2 in A (solid edges in Figure 4).
2. y1 belongs to B1° and y2 belongs to A2°, and
   Method M1: either y1 is a parent of y2 in B (large dashed edges in Figure 4), or y2 is a parent of y1 in B (small dashed edges in Figure 4).
   Method M2: y1 is a parent of y2 in A (dotted edges in Figure 4).

In condition 1, both nodes are initially part of the same parent chromosome, so we transfer the parent relation from that parent chromosome. In condition 2, node x1 is from A1° and x2 is a part of B2°.
We have to transfer the parent relation between x1 and x2 to X either from A or from B. This

Fig. 4. Generating Parent Information: (a) parent chromosomes, (b) daughters using M1, (c) daughters using M2

choice is made at random. If A is chosen, we say that we are using method M1; in the case of method M2, the parent relation from B is used. Though the steps for generating Y appear to be similar, they differ slightly: condition 1 is very similar to that of chromosome X, but in condition 2 the roles of M1 and M2 are reversed. This ensures that A (or B) is used to transfer the parent information to Y when B (or A) is used for X in the same situation. Figure 5 shows the behavior of our crossover function when two chromosomes with the same node order are crossed over.

Fig. 5. Crossover For Chromosomes Of The Same Node Order

Previous attempts at structure learning using genetic algorithms have to deal with cyclic networks. In our method, the edges represented by small dashes in Figure 4 have the potential to form cycles in the daughter chromosomes. To prevent cycles, two approaches can be considered: the first is to reverse such edges, and the second is to delete them. Steck uses the concept of edge reversal in his learning algorithm [12], whereas Larranaga, in his algorithm,

deletes the cycle-forming edges [8]. We perform a test to investigate which of the two approaches is better: we take a sample data set for the ASIA network [10] and use it to learn the network with both approaches. Figure 6 shows the percentage of runs in which the optimal ASIA network is learned. From the figure, it is evident that reversing an edge is a better approach than deleting it. Though a reversed edge between two nodes may no longer represent the causal relationship as before, there is often little difference between assuming that A causes B and assuming that B causes A. Since a node order is associated with our daughter chromosomes, cycles are prevented because the direction of potentially dangerous edges is reversed intrinsically; thus no parent information is lost through edge deletion. For any two nodes of the Bayesian network, the parent information for the first daughter follows directly from the parent information between the same nodes in one of the two parent chromosomes, and the parent information for the second daughter follows directly from the other parent chromosome. Each parent relation between nodes of the parent chromosomes is transferred to exactly one of the two daughter chromosomes. In the crossover operation, no edges are lost, nor do any new edges appear in the daughter chromosomes. During the process, nodes inherited from the same parent preserve the mutual relations they had in that parent. This preserves the goodness of the chromosomes and proves particularly beneficial when one parent chromosome has captured the dependence among a few nodes and the other parent has captured the dependence among the others.

Fig. 6. Comparison of two approaches to prevent cycle formation while learning the structure of the ASIA network. In one approach, potentially dangerous edges are reversed; in the other, they are deleted.
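The parent-assignment rules of Step 2, including the intrinsic edge reversal just described, can be sketched as follows. This is our reading of the conditions above under an assumed (order, edge-set) representation; the names are ours. A cross-piece edge copied from the network whose order disagrees with the daughter's is re-oriented to follow the daughter's node order, which is where the implicit reversal happens.

```python
def crossover_parents(a, b, cp, method):
    """Step 2 sketch.  Chromosomes are (order, edges) pairs, where edges
    is a set of (parent, child) node pairs that point forward in the
    chromosome's own node order."""
    a_order, a_edges = a
    b_order, b_edges = b
    a1 = set(a_order[:cp])                              # nodes of piece A1
    x_order = a_order[:cp] + [n for n in b_order if n not in a1]
    y_order = [n for n in b_order if n in a1] + a_order[cp:]

    def build(order, first_src, second_src, cross_src, cross_directed):
        edges = set()
        for i, u in enumerate(order):
            for v in order[i + 1:]:                     # u precedes v
                if u in a1 and v in a1:                 # both in first piece
                    hit = (u, v) in first_src
                elif u not in a1 and v not in a1:       # both in second piece
                    hit = (u, v) in second_src
                elif cross_directed:                    # cross pair, directed
                    hit = (u, v) in cross_src
                else:                                   # cross pair: take edge in
                    hit = (u, v) in cross_src or (v, u) in cross_src  # either
                if hit:                                 # direction, re-oriented
                    edges.add((u, v))
        return edges

    if method == "M1":   # cross edges of X come from A, those of Y from B
        x_edges = build(x_order, a_edges, b_edges, a_edges, True)
        y_edges = build(y_order, b_edges, a_edges, b_edges, False)
    else:                # "M2": roles reversed
        x_edges = build(x_order, a_edges, b_edges, b_edges, False)
        y_edges = build(y_order, b_edges, a_edges, a_edges, True)
    return (x_order, x_edges), (y_order, y_edges)
```

Every emitted edge points forward in the daughter's order, so both daughters are acyclic by construction.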
As explained above, during crossover, one of the two methods (M1 and M2) is randomly chosen to generate the daughter chromosomes. It is imperative that both

methods be used so that the learning process spans the whole solution space. To confirm this experimentally, we perform a test on a subset of the ALARM network [1]. We observe that the quality of the networks learned deteriorates when only one of the two methods is used. The Hamming Distance (HD) between two networks is the number of edges that are not common to both networks. The best network learned using only method M1 has an HD value of 8, whereas when both methods are used (all other parameters kept constant), the learned network has a much smaller HD value.

We use the operator ⊗ to represent our crossover function. The expression

A ⊗[cp,m] B → X, Y

is read as: parent chromosome A crossed over with parent chromosome B, at crossover point cp using method m (M1 or M2), produces daughter chromosomes X and Y. We use the same notation for node orders alone; note, however, that the generation of the daughters' node orders is independent of the method used:

A° ⊗[cp] B° → X°, Y°

Our crossover function is asymmetric, so the order of inputs and outputs is important. Let S be a set of chromosomes or node orders. S* represents the closure of S over the crossover function applied at every crossover point and using both methods. Crossing over chromosomes with identical node orders produces daughter chromosomes with the same order. This can be stated as the following axiom.

Axiom 1: {A°}* = {A°}

6 Generation Of Population

We use the term spanning population to mean that, through crossover, the initial population is capable of producing every chromosome in the solution space. A spanning population is important for the success of a genetic algorithm: if care is not taken, it is easy to generate a population that is not rich enough to produce an optimal solution.
We introduce the following functions to discuss the formation of our initial population.

Invert generates a chromosome whose node order is the reverse of that of the input chromosome A; the connectivity matrix associated with the chromosome is not altered. We will show that Invert is required for our algorithm to be able to iterate over all node orders in the solution space. We use the notation Invert(A) = Â. Figure 7 shows the behavior of our crossover function when A is crossed over with its inverse order (only the node orders are shown in Figure 7).

ToggleParents inverts the bits of the parent information of each gene to produce a new chromosome. ToggleParents is needed to be able to produce every possible parent configuration for a given node order. For each node n that

precedes node m in the node order of a chromosome A, n is a parent of m in ToggleParents(A) if and only if n is not a parent of m in A. See Figure 8. We use the notation ToggleParents(A) = Toggle(A).

Fig. 7. Crossover Of A Node Order With Its Reverse

Fig. 8. The Invert And ToggleParents Operations

Axiom 2: For any chromosome A: Toggle(Toggle(A)) = A, Invert(Invert(A)) = A, and Toggle(Invert(A)) = Invert(Toggle(A)).

Our initial population is generated according to the following mechanism. Chromosomes generated randomly are added to the population. To make the population rich, for every chromosome A we also add Â, Toggle(A), and Toggle(Â). This ensures that every chromosome in the solution space is derivable from our initial population.

Theorem 1: Let U be the universal set of all chromosomes of n nodes and let A be any chromosome of n nodes. Then

{A, Â, Toggle(A), Toggle(Â)}* = U

The theorem states that, starting with A and its three variants and repeatedly applying crossover, it is possible to generate every chromosome in the universal set U. To prove this theorem, we first define and prove some lemmas.

Lemma 1: For any two chromosomes A and B: if A ⊗[cp,m] B → X, Y, then Toggle(A) ⊗[cp,m] Toggle(B) → Toggle(X), Toggle(Y).

In our crossover function, the parent relation between two nodes of the first daughter chromosome is derived directly from one of the two parent chromosomes, and the relation between the same two nodes of the second daughter is derived from the other parent. Though the direction of an edge may be reversed to preserve the acyclic nature of the daughters, the edge still exists. Thus, toggling all the parent bits in the parent chromosomes results in toggling the parent bits of the daughter chromosomes. Lemma 1 states this argument mathematically.

Lemma 2: Let A°, B° and R° be node orders such that R° ∈ {A°, B°}*, and let x be a new node. Then, writing · for concatenation, x·R° ∈ {x·A°, x·B°}*.

Consider two node orders A° and B° and a new node x. The following is true by the definition of step 1 of our crossover function:

A° ⊗[cp] B° → C°, D°  implies  x·A° ⊗[cp+1] x·B° → x·C°, x·D°   (1)

Lemma 2 follows from Equation 1: if we prepend x to all the steps taken to derive R° from A° and B°, it is clear that x·R° is derivable from x·A° and x·B°.

Lemma 3: Let U°_n be the universal set of node orders of n nodes and let A° be any node order of n nodes. Then {A°, Â°}* = U°_n.

We prove Lemma 3 by induction on n. The lemma is trivially true for n = 1. Assume that it is true for n = k; thus, for any node order B^k (recall that the superscript on a chromosome denotes its number of nodes):

{B^k, Invert(B^k)}* = U°_k   (2)

It is sufficient to prove that for any A^{k+1} and R^{k+1}:

R^{k+1} ∈ {A^{k+1}, Â^{k+1}}*   (3)

Let the first node of R^{k+1} be the (t+1)-th node of A^{k+1}; that is, A_{t+1} = R_1. As shown in Figure 7, we perform crossover on A° and Â° with t+1 as the crossover point:

A° ⊗[t+1] Â° → L, M   (4)

where M = A_{t+1}, ..., A_1, A_{t+2}, ..., A_{k+1}. We then perform crossover on Â° and A° with k−t as the crossover point:

Â° ⊗[k−t] A° → N, O   (5)

where N = A_{k+1}, ..., A_{t+2}, A_1, ..., A_{t+1}. From Equations 4 and 5 we can see that N = Invert(M). Crossing over M and N with 1 as the crossover point:

M ⊗[1] N → P, Q   (6)

where P = A_{t+1}, A_{k+1}, ..., A_{t+2}, A_1, ..., A_t and Q = A_{t+1}, A_t, ..., A_1, A_{t+2}, ..., A_{k+1}. Recall that A_{t+1} = R_1. Thus P can be written as R_1·B^k, where B^k = A_{k+1}, ..., A_{t+2}, A_1, ..., A_t, and Q can be written as R_1·Invert(B^k). Thus:

R_1·B^k ∈ {A^{k+1}, Â^{k+1}}* and R_1·Invert(B^k) ∈ {A^{k+1}, Â^{k+1}}*   (7)

From Lemma 2 and Equation 2:

R^{k+1} ∈ {R_1·B^k, R_1·Invert(B^k)}*   (8)

Combining Equations 7 and 8 proves Equation 3, and hence Lemma 3.

Lemma 4: Let Z be any chromosome and let T be a chromosome formed by toggling any one parent bit of Z. Then T ∈ {Z, Toggle(Z)}* and Toggle(T) ∈ {Z, Toggle(Z)}*.

In the following steps we cross over chromosomes with identical node orders (refer to Figure 5). Suppose T differs from Z only in the bit T_i,j = ¬Z_i,j. To prove Lemma 4, we first use two crossover operations with method M1 to arrive at a chromosome G that differs from Z in the j-th parent bit of every gene. In the next two steps, we use method M2 to swap the i-th gene of Z with the i-th gene of G, generating T. The steps are as follows:

Z ⊗[j,M1] Toggle(Z) → E, F and E ⊗[j+1,M1] Z → G, H   (9)

where E_i.Parents = Z_i,1, ..., Z_i,j−1, ¬Z_i,j, ..., ¬Z_i,i−1 and G_i.Parents = Z_i,1, ..., Z_i,j−1, ¬Z_i,j, Z_i,j+1, ..., Z_i,i−1. Recall that we are working with chromosomes of identical node orders; thus G.Gene_i = T.Gene_i.

Z ⊗[i,M2] G → I, J and I ⊗[i+1,M2] Z → K, L   (10)

where I = Z.Gene_1, ..., Z.Gene_{i−1}, T.Gene_i, G.Gene_{i+1}, ..., G.Gene_n and K = Z.Gene_1, ..., Z.Gene_{i−1}, T.Gene_i, Z.Gene_{i+1}, ..., Z.Gene_n. It is evident that K = T. Thus, from Equations 9 and 10, T ∈ {Z, Toggle(Z)}*. From Lemma 1 we can then deduce Toggle(T) ∈ {Toggle(Z), Toggle(Toggle(Z))}*, and from Axiom 2, Toggle(Toggle(Z)) = Z. Thus Lemma 4 is proved.

Lemma 5: Let Z be any chromosome and let V be the universal set of all chromosomes with Z° as their node order. Then {Z, Toggle(Z)}* = V.

This lemma is an extension of Lemma 4: any chromosome in V is reachable from {Z, Toggle(Z)} by applying Lemma 4 repeatedly.

We can now prove Theorem 1 by the following argument. Let S be any chromosome from the universal set U of the theorem. Lemma 3 states that, starting with A and Â, it is possible to reach some chromosome Q whose node order is the same as that of S; mathematically, there exists Q such that Q° = S° and Q ∈ {A, Â}*. From Lemma 1, Toggle(Q) ∈ {Toggle(A), Toggle(Â)}*. Since the node orders of S and Q are the same, Lemma 5 gives S ∈ {Q, Toggle(Q)}*, and thus S ∈ {A, Â, Toggle(A), Toggle(Â)}*. Since S was an arbitrary chromosome from U, we conclude that U = {A, Â, Toggle(A), Toggle(Â)}*.

We have proved that, starting with our initial population, it is theoretically possible to reach every chromosome of the solution space, a necessary (but not sufficient) condition for a good genetic algorithm. To test the importance of enriching the population, we compare the networks learned from a randomly generated population with those learned from a population of the same size obtained by enriching a smaller random population; to compare the results on a fair basis, the latter learning process generates correspondingly fewer daughters in each iteration. We observe that better networks are learned with an enriched population. To present the results for this test we use a 6 node network.
For this network, we learn the optimal structure using an enriched population, whereas the best network learned from a purely random population differs from the optimal one (nonzero Hamming Distance). We also compare the behavior of our algorithm when same-sized random and enriched populations are used. For small population sizes, the networks learned using the enriched population are better; for larger sizes, the random population is already rich enough that enriching it does not improve the learning process any further.

7 Mutations

Mutation is a function that randomly changes one or more genes of a chromosome. It prevents the learning process from getting stuck at local optima. The acyclic nature of our chromosomes is easily preserved by our mutation function. We mutate both the node order and the connectivity matrix of our chromosomes.
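The mutation of both components can be sketched as follows. This is a minimal sketch under an assumed representation (ours, not the paper's): the connectivity matrix is stored as a lower-triangular list of 0/1 rows, so any result is again a valid acyclic chromosome.

```python
import random

def mutate(order, parents, p):
    """Mutation sketch: with probability p, swap two randomly chosen
    nodes in the node order; independently, toggle each bit of the
    lower-triangular connectivity matrix with probability p.  Both
    operations yield a well-formed (order, matrix) pair, so acyclicity
    is preserved."""
    order = order[:]                      # do not mutate the input in place
    if random.random() < p:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    parents = [[bit ^ (random.random() < p) for bit in row]
               for row in parents]
    return order, parents
```

Note that swapping two nodes in the order re-interprets the untouched matrix rows against the new order, which changes some edges; that is intended here, since mutation exists precisely to perturb the network.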

We toggle the bits of the connectivity matrix based on a mutation probability. Our mutation operation for the node order swaps two randomly chosen nodes, again based on the mutation probability.

8 Results

Given a Bayesian network (network structure plus probability tables), the Logic Sampling method is used to generate a data set [7]. The Acyclic Bayesian Net Generator is then used to learn the Bayesian network structure. For the learning process, we generate an enriched population as described in Section 6. In each iteration of the genetic algorithm, we produce daughter chromosomes and discard the worst chromosomes (those with the lowest scores) from the union of daughters and parents, keeping the population size constant. The logarithm of the Bayesian Dirichlet metric is used as the scoring function [3, 6]. The best network learned is compared with the optimal network. We test our algorithm on Bayesian networks of different sizes, including ASIA (an 8-node network) [10], ALARM (a 37-node network) [1], a network with 6 nodes, and a subset of ALARM. Sometimes we learn a network with a better score than that of the original network; the term optimal network denotes the original network unless a better network is learned over the span of all the tests performed. In our tests, we are able to learn the optimal structure for all networks except one, the ALARM network, for which we learn a nearly optimal solution whose score is very close to that of the original ALARM network. We also test our algorithm while varying the population size of the genetic algorithm: as the population size increases, the probability that the optimal structure is learned approaches 1.

9 Conclusions

We have presented the Acyclic Bayesian Net Generator, a new technique for learning Bayesian network structure using genetic algorithms. The acyclic nature is inherent to the encoding scheme and is easily preserved during the learning process.
In this technique, the genetic algorithm operations crossover and mutation are closed operations. In our method, throwing out cycle-forming arcs instead of reversing them causes the algorithm to converge more slowly and produces a poorer-fitting Bayesian network; we hypothesize that other methods suffer when arcs are thrown out. In the Acyclic Bayesian Net Generator, we do not assume any predefined variable ordering for structure learning: our algorithm learns the optimal node ordering and the optimal network simultaneously. This is better than a two-step approach in which the first step learns an optimal node ordering and the second step learns the actual structure.

Our crossover function is O(n^2), in line with previous approaches. Tests show that our algorithm works well when applied to networks such as ASIA and ALARM. We have also presented a technique for choosing an initial population in a semi-random manner such that it is rich enough to ensure that every chromosome in the solution space is reachable from the initial population. Results show that Bayesian networks are learned effectively when this method is used.

References

1. I. A. Beinlich, H. J. Suermondt, R. M. Chavez, and G. F. Cooper (1989). The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks. Proceedings of the Second European Conference on Artificial Intelligence in Medicine.
2. D. M. Chickering, D. Geiger, and D. Heckerman (1994). Learning Bayesian networks is NP-hard. Technical Report MSR-TR-94-17, Microsoft Research.
3. G. F. Cooper and E. Herskovits (1992). A Bayesian Method for the Induction of Probabilistic Networks from Data. Machine Learning, 9:309-347.
4. H. Guo, B. B. Perry, J. A. Stilson, and W. H. Hsu (2002). A Genetic Algorithm for Tuning Variable Orderings in Bayesian Network Structure Learning. Student Abstract, AAAI-2002.
5. D. Heckerman (1995). A Tutorial on Learning Bayesian Networks. Technical Report MSR-TR-95-06, Microsoft Research.
6. D. Heckerman, D. Geiger, and D. Chickering (1995). Learning Bayesian networks: The combination of knowledge and statistical data. Machine Learning, 20:197-243.
7. M. Henrion (1988). Propagating uncertainty in Bayesian networks by probabilistic logic sampling. Uncertainty in Artificial Intelligence 2, New York, NY: Elsevier Science Publishing Company, Inc.
8. P. Larranaga, M. Poza, Y. Yurramendi, R. H. Murga, and C. M. H. Kuijpers (1996a). Structure Learning of Bayesian Networks by Genetic Algorithms: A Performance Analysis of Control Parameters. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(9):912-926.
9. P. Larranaga, R. Murga, M. Poza, and C. Kuijpers (1996b). Structure Learning of Bayesian Networks by Hybrid Genetic Algorithms. In D. Fisher and H. J. Lenz (eds.), Learning from Data: Artificial Intelligence and Statistics V, Lecture Notes in Statistics, New York, NY: Springer-Verlag.
10. S. L. Lauritzen and D. J. Spiegelhalter (1988). Local Computations with Probabilities on Graphical Structures and Their Application to Expert Systems. Journal of the Royal Statistical Society, Series B, 50(2):157-224.
11. J. W. Myers, K. B. Laskey, and K. A. DeJong (1999). Learning Bayesian Networks from Incomplete Data using Evolutionary Algorithms. Proceedings of the Genetic and Evolutionary Computation Conference.
12. H. Steck (2000). On the Use of Skeletons when Learning in Bayesian Networks. Sixteenth Conference on Uncertainty in Artificial Intelligence, UAI-2000.
13. D. Whitley (1993). A Genetic Algorithm Tutorial. Technical Report CS-93-103, Department of Computer Science, Colorado State University.


More information

Using Genetic Algorithm with Triple Crossover to Solve Travelling Salesman Problem

Using Genetic Algorithm with Triple Crossover to Solve Travelling Salesman Problem Proc. 1 st International Conference on Machine Learning and Data Engineering (icmlde2017) 20-22 Nov 2017, Sydney, Australia ISBN: 978-0-6480147-3-7 Using Genetic Algorithm with Triple Crossover to Solve

More information

Mutations for Permutations

Mutations for Permutations Mutations for Permutations Insert mutation: Pick two allele values at random Move the second to follow the first, shifting the rest along to accommodate Note: this preserves most of the order and adjacency

More information

An Evolutionary Algorithm for the Multi-objective Shortest Path Problem

An Evolutionary Algorithm for the Multi-objective Shortest Path Problem An Evolutionary Algorithm for the Multi-objective Shortest Path Problem Fangguo He Huan Qi Qiong Fan Institute of Systems Engineering, Huazhong University of Science & Technology, Wuhan 430074, P. R. China

More information

Distributed minimum spanning tree problem

Distributed minimum spanning tree problem Distributed minimum spanning tree problem Juho-Kustaa Kangas 24th November 2012 Abstract Given a connected weighted undirected graph, the minimum spanning tree problem asks for a spanning subtree with

More information

The strong chromatic number of a graph

The strong chromatic number of a graph The strong chromatic number of a graph Noga Alon Abstract It is shown that there is an absolute constant c with the following property: For any two graphs G 1 = (V, E 1 ) and G 2 = (V, E 2 ) on the same

More information

1. Introduction. 2. Motivation and Problem Definition. Volume 8 Issue 2, February Susmita Mohapatra

1. Introduction. 2. Motivation and Problem Definition. Volume 8 Issue 2, February Susmita Mohapatra Pattern Recall Analysis of the Hopfield Neural Network with a Genetic Algorithm Susmita Mohapatra Department of Computer Science, Utkal University, India Abstract: This paper is focused on the implementation

More information

MODELLING DOCUMENT CATEGORIES BY EVOLUTIONARY LEARNING OF TEXT CENTROIDS

MODELLING DOCUMENT CATEGORIES BY EVOLUTIONARY LEARNING OF TEXT CENTROIDS MODELLING DOCUMENT CATEGORIES BY EVOLUTIONARY LEARNING OF TEXT CENTROIDS J.I. Serrano M.D. Del Castillo Instituto de Automática Industrial CSIC. Ctra. Campo Real km.0 200. La Poveda. Arganda del Rey. 28500

More information

A note on the pairwise Markov condition in directed Markov fields

A note on the pairwise Markov condition in directed Markov fields TECHNICAL REPORT R-392 April 2012 A note on the pairwise Markov condition in directed Markov fields Judea Pearl University of California, Los Angeles Computer Science Department Los Angeles, CA, 90095-1596,

More information

Evolving SQL Queries for Data Mining

Evolving SQL Queries for Data Mining Evolving SQL Queries for Data Mining Majid Salim and Xin Yao School of Computer Science, The University of Birmingham Edgbaston, Birmingham B15 2TT, UK {msc30mms,x.yao}@cs.bham.ac.uk Abstract. This paper

More information

Lecture notes on the simplex method September We will present an algorithm to solve linear programs of the form. maximize.

Lecture notes on the simplex method September We will present an algorithm to solve linear programs of the form. maximize. Cornell University, Fall 2017 CS 6820: Algorithms Lecture notes on the simplex method September 2017 1 The Simplex Method We will present an algorithm to solve linear programs of the form maximize subject

More information

Summary: A Tutorial on Learning With Bayesian Networks

Summary: A Tutorial on Learning With Bayesian Networks Summary: A Tutorial on Learning With Bayesian Networks Markus Kalisch May 5, 2006 We primarily summarize [4]. When we think that it is appropriate, we comment on additional facts and more recent developments.

More information

Ordering-Based Search: A Simple and Effective Algorithm for Learning Bayesian Networks

Ordering-Based Search: A Simple and Effective Algorithm for Learning Bayesian Networks Ordering-Based Search: A Simple and Effective Algorithm for Learning Bayesian Networks Marc Teyssier Computer Science Dept. Stanford University Stanford, CA 94305 Daphne Koller Computer Science Dept. Stanford

More information

A step towards the Bermond-Thomassen conjecture about disjoint cycles in digraphs

A step towards the Bermond-Thomassen conjecture about disjoint cycles in digraphs A step towards the Bermond-Thomassen conjecture about disjoint cycles in digraphs Nicolas Lichiardopol Attila Pór Jean-Sébastien Sereni Abstract In 1981, Bermond and Thomassen conjectured that every digraph

More information

MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS

MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS In: Journal of Applied Statistical Science Volume 18, Number 3, pp. 1 7 ISSN: 1067-5817 c 2011 Nova Science Publishers, Inc. MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS Füsun Akman

More information

Dependency detection with Bayesian Networks

Dependency detection with Bayesian Networks Dependency detection with Bayesian Networks M V Vikhreva Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University, Leninskie Gory, Moscow, 119991 Supervisor: A G Dyakonov

More information

BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR FLEXIBLE JOB-SHOP SCHEDULING PROBLEM. Minimizing Make Span and the Total Workload of Machines

BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR FLEXIBLE JOB-SHOP SCHEDULING PROBLEM. Minimizing Make Span and the Total Workload of Machines International Journal of Mathematics and Computer Applications Research (IJMCAR) ISSN 2249-6955 Vol. 2 Issue 4 Dec - 2012 25-32 TJPRC Pvt. Ltd., BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR FLEXIBLE JOB-SHOP

More information

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 131 CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 6.1 INTRODUCTION The Orthogonal arrays are helpful in guiding the heuristic algorithms to obtain a good solution when applied to NP-hard problems. This

More information

Node Aggregation for Distributed Inference in Bayesian Networks

Node Aggregation for Distributed Inference in Bayesian Networks Node Aggregation for Distributed Inference in Bayesian Networks Kuo-Chu Chang and Robert Fung Advanced Decision Systmes 1500 Plymouth Street Mountain View, California 94043-1230 Abstract This study describes

More information

CPS 102: Discrete Mathematics. Quiz 3 Date: Wednesday November 30, Instructor: Bruce Maggs NAME: Prob # Score. Total 60

CPS 102: Discrete Mathematics. Quiz 3 Date: Wednesday November 30, Instructor: Bruce Maggs NAME: Prob # Score. Total 60 CPS 102: Discrete Mathematics Instructor: Bruce Maggs Quiz 3 Date: Wednesday November 30, 2011 NAME: Prob # Score Max Score 1 10 2 10 3 10 4 10 5 10 6 10 Total 60 1 Problem 1 [10 points] Find a minimum-cost

More information

Announcements. CS 188: Artificial Intelligence Fall Reminder: CSPs. Today. Example: 3-SAT. Example: Boolean Satisfiability.

Announcements. CS 188: Artificial Intelligence Fall Reminder: CSPs. Today. Example: 3-SAT. Example: Boolean Satisfiability. CS 188: Artificial Intelligence Fall 2008 Lecture 5: CSPs II 9/11/2008 Announcements Assignments: DUE W1: NOW P1: Due 9/12 at 11:59pm Assignments: UP W2: Up now P2: Up by weekend Dan Klein UC Berkeley

More information

CS 188: Artificial Intelligence Fall 2008

CS 188: Artificial Intelligence Fall 2008 CS 188: Artificial Intelligence Fall 2008 Lecture 5: CSPs II 9/11/2008 Dan Klein UC Berkeley Many slides over the course adapted from either Stuart Russell or Andrew Moore 1 1 Assignments: DUE Announcements

More information

CS2 Algorithms and Data Structures Note 10. Depth-First Search and Topological Sorting

CS2 Algorithms and Data Structures Note 10. Depth-First Search and Topological Sorting CS2 Algorithms and Data Structures Note 10 Depth-First Search and Topological Sorting In this lecture, we will analyse the running time of DFS and discuss a few applications. 10.1 A recursive implementation

More information

A CSP Search Algorithm with Reduced Branching Factor

A CSP Search Algorithm with Reduced Branching Factor A CSP Search Algorithm with Reduced Branching Factor Igor Razgon and Amnon Meisels Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, 84-105, Israel {irazgon,am}@cs.bgu.ac.il

More information

On Local Optima in Learning Bayesian Networks

On Local Optima in Learning Bayesian Networks On Local Optima in Learning Bayesian Networks Jens D. Nielsen, Tomáš Kočka and Jose M. Peña Department of Computer Science Aalborg University, Denmark {dalgaard, kocka, jmp}@cs.auc.dk Abstract This paper

More information

Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems

Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems Bi-Objective Optimization for Scheduling in Heterogeneous Computing Systems Tony Maciejewski, Kyle Tarplee, Ryan Friese, and Howard Jay Siegel Department of Electrical and Computer Engineering Colorado

More information

An Effective Upperbound on Treewidth Using Partial Fill-in of Separators

An Effective Upperbound on Treewidth Using Partial Fill-in of Separators An Effective Upperbound on Treewidth Using Partial Fill-in of Separators Boi Faltings Martin Charles Golumbic June 28, 2009 Abstract Partitioning a graph using graph separators, and particularly clique

More information

Probabilistic Abstraction Lattices: A Computationally Efficient Model for Conditional Probability Estimation

Probabilistic Abstraction Lattices: A Computationally Efficient Model for Conditional Probability Estimation Probabilistic Abstraction Lattices: A Computationally Efficient Model for Conditional Probability Estimation Daniel Lowd January 14, 2004 1 Introduction Probabilistic models have shown increasing popularity

More information

An algorithm for Performance Analysis of Single-Source Acyclic graphs

An algorithm for Performance Analysis of Single-Source Acyclic graphs An algorithm for Performance Analysis of Single-Source Acyclic graphs Gabriele Mencagli September 26, 2011 In this document we face with the problem of exploiting the performance analysis of acyclic graphs

More information

Learning Bayesian Networks from Incomplete Data using Evolutionary Algorithms

Learning Bayesian Networks from Incomplete Data using Evolutionary Algorithms Learning ayesian Networks from Incomplete ata using Evolutionary lgorithms James W. Myers George Mason University Fairfax, V 22030 Kathryn. Laskey George Mason University Fairfax, V 22030 Kenneth. ejong

More information

Ordering attributes for missing values prediction and data classification

Ordering attributes for missing values prediction and data classification Ordering attributes for missing values prediction and data classification E. R. Hruschka Jr., N. F. F. Ebecken COPPE /Federal University of Rio de Janeiro, Brazil. Abstract This work shows the application

More information

Lecture 15: The subspace topology, Closed sets

Lecture 15: The subspace topology, Closed sets Lecture 15: The subspace topology, Closed sets 1 The Subspace Topology Definition 1.1. Let (X, T) be a topological space with topology T. subset of X, the collection If Y is a T Y = {Y U U T} is a topology

More information

A Well-Behaved Algorithm for Simulating Dependence Structures of Bayesian Networks

A Well-Behaved Algorithm for Simulating Dependence Structures of Bayesian Networks A Well-Behaved Algorithm for Simulating Dependence Structures of Bayesian Networks Yang Xiang and Tristan Miller Department of Computer Science University of Regina Regina, Saskatchewan, Canada S4S 0A2

More information

Escola Politécnica, University of São Paulo Av. Prof. Mello Moraes, 2231, , São Paulo, SP - Brazil

Escola Politécnica, University of São Paulo Av. Prof. Mello Moraes, 2231, , São Paulo, SP - Brazil Generalizing Variable Elimination in Bayesian Networks FABIO GAGLIARDI COZMAN Escola Politécnica, University of São Paulo Av. Prof. Mello Moraes, 2231, 05508-900, São Paulo, SP - Brazil fgcozman@usp.br

More information

Escaping Local Optima: Genetic Algorithm

Escaping Local Optima: Genetic Algorithm Artificial Intelligence Escaping Local Optima: Genetic Algorithm Dae-Won Kim School of Computer Science & Engineering Chung-Ang University We re trying to escape local optima To achieve this, we have learned

More information

Fuzzy Inspired Hybrid Genetic Approach to Optimize Travelling Salesman Problem

Fuzzy Inspired Hybrid Genetic Approach to Optimize Travelling Salesman Problem Fuzzy Inspired Hybrid Genetic Approach to Optimize Travelling Salesman Problem Bindu Student, JMIT Radaur binduaahuja@gmail.com Mrs. Pinki Tanwar Asstt. Prof, CSE, JMIT Radaur pinki.tanwar@gmail.com Abstract

More information

A Comparison of Structural Distance Measures for Causal Bayesian Network Models

A Comparison of Structural Distance Measures for Causal Bayesian Network Models Recent Advances in Intelligent Information Systems ISBN 978-83-60434-59-8, pages 443 456 A Comparison of Structural Distance Measures for Causal Bayesian Network Models Martijn de Jongh 1 and Marek J.

More information

CS 188: Artificial Intelligence Spring Today

CS 188: Artificial Intelligence Spring Today CS 188: Artificial Intelligence Spring 2006 Lecture 7: CSPs II 2/7/2006 Dan Klein UC Berkeley Many slides from either Stuart Russell or Andrew Moore Today More CSPs Applications Tree Algorithms Cutset

More information

EVOLVING LEGO. Exploring the impact of alternative encodings on the performance of evolutionary algorithms. 1. Introduction

EVOLVING LEGO. Exploring the impact of alternative encodings on the performance of evolutionary algorithms. 1. Introduction N. Gu, S. Watanabe, H. Erhan, M. Hank Haeusler, W. Huang, R. Sosa (eds.), Rethinking Comprehensive Design: Speculative Counterculture, Proceedings of the 19th International Conference on Computer- Aided

More information

PROOF OF THE COLLATZ CONJECTURE KURMET SULTAN. Almaty, Kazakhstan. ORCID ACKNOWLEDGMENTS

PROOF OF THE COLLATZ CONJECTURE KURMET SULTAN. Almaty, Kazakhstan.   ORCID ACKNOWLEDGMENTS PROOF OF THE COLLATZ CONJECTURE KURMET SULTAN Almaty, Kazakhstan E-mail: kurmet.sultan@gmail.com ORCID 0000-0002-7852-8994 ACKNOWLEDGMENTS 2 ABSTRACT This article contains a proof of the Collatz conjecture.

More information

Heuristic Optimisation

Heuristic Optimisation Heuristic Optimisation Revision Lecture Sándor Zoltán Németh http://web.mat.bham.ac.uk/s.z.nemeth s.nemeth@bham.ac.uk University of Birmingham S Z Németh (s.nemeth@bham.ac.uk) Heuristic Optimisation University

More information

6. Lecture notes on matroid intersection

6. Lecture notes on matroid intersection Massachusetts Institute of Technology 18.453: Combinatorial Optimization Michel X. Goemans May 2, 2017 6. Lecture notes on matroid intersection One nice feature about matroids is that a simple greedy algorithm

More information

One-Point Geometric Crossover

One-Point Geometric Crossover One-Point Geometric Crossover Alberto Moraglio School of Computing and Center for Reasoning, University of Kent, Canterbury, UK A.Moraglio@kent.ac.uk Abstract. Uniform crossover for binary strings has

More information

Genetic algorithms for the synthesis optimization of a set of irredundant diagnostic tests in the intelligent system

Genetic algorithms for the synthesis optimization of a set of irredundant diagnostic tests in the intelligent system Computer Science Journal of Moldova, vol.9, no.3(27), 2001 Genetic algorithms for the synthesis optimization of a set of irredundant diagnostic tests in the intelligent system Anna E. Yankovskaya Alex

More information

DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES

DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES DETERMINING MAXIMUM/MINIMUM VALUES FOR TWO- DIMENTIONAL MATHMATICLE FUNCTIONS USING RANDOM CREOSSOVER TECHNIQUES SHIHADEH ALQRAINY. Department of Software Engineering, Albalqa Applied University. E-mail:

More information

Mathematically Rigorous Software Design Review of mathematical prerequisites

Mathematically Rigorous Software Design Review of mathematical prerequisites Mathematically Rigorous Software Design 2002 September 27 Part 1: Boolean algebra 1. Define the Boolean functions and, or, not, implication ( ), equivalence ( ) and equals (=) by truth tables. 2. In an

More information

Evaluating the Effect of Perturbations in Reconstructing Network Topologies

Evaluating the Effect of Perturbations in Reconstructing Network Topologies DSC 2 Working Papers (Draft Versions) http://www.ci.tuwien.ac.at/conferences/dsc-2/ Evaluating the Effect of Perturbations in Reconstructing Network Topologies Florian Markowetz and Rainer Spang Max-Planck-Institute

More information

Genetic Algorithm for Circuit Partitioning

Genetic Algorithm for Circuit Partitioning Genetic Algorithm for Circuit Partitioning ZOLTAN BARUCH, OCTAVIAN CREŢ, KALMAN PUSZTAI Computer Science Department, Technical University of Cluj-Napoca, 26, Bariţiu St., 3400 Cluj-Napoca, Romania {Zoltan.Baruch,

More information

Efficient Prefix Computation on Faulty Hypercubes

Efficient Prefix Computation on Faulty Hypercubes JOURNAL OF INFORMATION SCIENCE AND ENGINEERING 17, 1-21 (21) Efficient Prefix Computation on Faulty Hypercubes YU-WEI CHEN AND KUO-LIANG CHUNG + Department of Computer and Information Science Aletheia

More information

CS5401 FS2015 Exam 1 Key

CS5401 FS2015 Exam 1 Key CS5401 FS2015 Exam 1 Key This is a closed-book, closed-notes exam. The only items you are allowed to use are writing implements. Mark each sheet of paper you use with your name and the string cs5401fs2015

More information

AXIOMS FOR THE INTEGERS

AXIOMS FOR THE INTEGERS AXIOMS FOR THE INTEGERS BRIAN OSSERMAN We describe the set of axioms for the integers which we will use in the class. The axioms are almost the same as what is presented in Appendix A of the textbook,

More information

On the Relationships between Zero Forcing Numbers and Certain Graph Coverings

On the Relationships between Zero Forcing Numbers and Certain Graph Coverings On the Relationships between Zero Forcing Numbers and Certain Graph Coverings Fatemeh Alinaghipour Taklimi, Shaun Fallat 1,, Karen Meagher 2 Department of Mathematics and Statistics, University of Regina,

More information

CS 188: Artificial Intelligence

CS 188: Artificial Intelligence CS 188: Artificial Intelligence CSPs II + Local Search Prof. Scott Niekum The University of Texas at Austin [These slides based on those of Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley.

More information

Midterm 2 Solutions. CS70 Discrete Mathematics and Probability Theory, Spring 2009

Midterm 2 Solutions. CS70 Discrete Mathematics and Probability Theory, Spring 2009 CS70 Discrete Mathematics and Probability Theory, Spring 2009 Midterm 2 Solutions Note: These solutions are not necessarily model answers. Rather, they are designed to be tutorial in nature, and sometimes

More information

On The Complexity of Virtual Topology Design for Multicasting in WDM Trees with Tap-and-Continue and Multicast-Capable Switches

On The Complexity of Virtual Topology Design for Multicasting in WDM Trees with Tap-and-Continue and Multicast-Capable Switches On The Complexity of Virtual Topology Design for Multicasting in WDM Trees with Tap-and-Continue and Multicast-Capable Switches E. Miller R. Libeskind-Hadas D. Barnard W. Chang K. Dresner W. M. Turner

More information

Literature Review On Implementing Binary Knapsack problem

Literature Review On Implementing Binary Knapsack problem Literature Review On Implementing Binary Knapsack problem Ms. Niyati Raj, Prof. Jahnavi Vitthalpura PG student Department of Information Technology, L.D. College of Engineering, Ahmedabad, India Assistant

More information

A Parallel Evolutionary Algorithm for Discovery of Decision Rules

A Parallel Evolutionary Algorithm for Discovery of Decision Rules A Parallel Evolutionary Algorithm for Discovery of Decision Rules Wojciech Kwedlo Faculty of Computer Science Technical University of Bia lystok Wiejska 45a, 15-351 Bia lystok, Poland wkwedlo@ii.pb.bialystok.pl

More information

Lecture 2 - Graph Theory Fundamentals - Reachability and Exploration 1

Lecture 2 - Graph Theory Fundamentals - Reachability and Exploration 1 CME 305: Discrete Mathematics and Algorithms Instructor: Professor Aaron Sidford (sidford@stanford.edu) January 11, 2018 Lecture 2 - Graph Theory Fundamentals - Reachability and Exploration 1 In this lecture

More information

Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm

Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm I. Bruha and F. Franek Dept of Computing & Software, McMaster University Hamilton, Ont., Canada, L8S4K1 Email:

More information

Sparse Hypercube 3-Spanners

Sparse Hypercube 3-Spanners Sparse Hypercube 3-Spanners W. Duckworth and M. Zito Department of Mathematics and Statistics, University of Melbourne, Parkville, Victoria 3052, Australia Department of Computer Science, University of

More information

GENETIC ALGORITHM with Hands-On exercise

GENETIC ALGORITHM with Hands-On exercise GENETIC ALGORITHM with Hands-On exercise Adopted From Lecture by Michael Negnevitsky, Electrical Engineering & Computer Science University of Tasmania 1 Objective To understand the processes ie. GAs Basic

More information

Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms

Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms Franz Rothlauf Department of Information Systems University of Bayreuth, Germany franz.rothlauf@uni-bayreuth.de

More information

arxiv:cs/ v1 [cs.ne] 15 Feb 2004

arxiv:cs/ v1 [cs.ne] 15 Feb 2004 Parameter-less Hierarchical BOA Martin Pelikan and Tz-Kai Lin arxiv:cs/0402031v1 [cs.ne] 15 Feb 2004 Dept. of Math. and Computer Science, 320 CCB University of Missouri at St. Louis 8001 Natural Bridge

More information

The Genetic Algorithm for finding the maxima of single-variable functions

The Genetic Algorithm for finding the maxima of single-variable functions Research Inventy: International Journal Of Engineering And Science Vol.4, Issue 3(March 2014), PP 46-54 Issn (e): 2278-4721, Issn (p):2319-6483, www.researchinventy.com The Genetic Algorithm for finding

More information

A Technique for Design Patterns Detection

A Technique for Design Patterns Detection A Technique for Design Patterns Detection Manjari Gupta Department of computer science Institute of Science Banaras Hindu University Varansi-221005, India manjari_gupta@rediffmail.com Abstract Several

More information

Evolutionary Computation Part 2

Evolutionary Computation Part 2 Evolutionary Computation Part 2 CS454, Autumn 2017 Shin Yoo (with some slides borrowed from Seongmin Lee @ COINSE) Crossover Operators Offsprings inherit genes from their parents, but not in identical

More information

Robust Signal-Structure Reconstruction

Robust Signal-Structure Reconstruction Robust Signal-Structure Reconstruction V. Chetty 1, D. Hayden 2, J. Gonçalves 2, and S. Warnick 1 1 Information and Decision Algorithms Laboratories, Brigham Young University 2 Control Group, Department

More information

CS 188: Artificial Intelligence

CS 188: Artificial Intelligence CS 188: Artificial Intelligence Constraint Satisfaction Problems II Instructors: Dan Klein and Pieter Abbeel University of California, Berkeley [These slides were created by Dan Klein and Pieter Abbeel

More information

Algebra of Sets (Mathematics & Logic A)

Algebra of Sets (Mathematics & Logic A) Algebra of Sets (Mathematics & Logic A) RWK/MRQ October 28, 2002 Note. These notes are adapted (with thanks) from notes given last year by my colleague Dr Martyn Quick. Please feel free to ask me (not

More information

Bayesian Networks. A Bayesian network is a directed acyclic graph that represents causal relationships between random variables. Earthquake.

Bayesian Networks. A Bayesian network is a directed acyclic graph that represents causal relationships between random variables. Earthquake. Bayes Nets Independence With joint probability distributions we can compute many useful things, but working with joint PD's is often intractable. The naïve Bayes' approach represents one (boneheaded?)

More information

Inferring Regulatory Networks by Combining Perturbation Screens and Steady State Gene Expression Profiles

Inferring Regulatory Networks by Combining Perturbation Screens and Steady State Gene Expression Profiles Supporting Information to Inferring Regulatory Networks by Combining Perturbation Screens and Steady State Gene Expression Profiles Ali Shojaie,#, Alexandra Jauhiainen 2,#, Michael Kallitsis 3,#, George

More information

Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest

Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest Bhakti V. Gavali 1, Prof. Vivekanand Reddy 2 1 Department of Computer Science and Engineering, Visvesvaraya Technological

More information

Lecture 19. Broadcast routing

Lecture 19. Broadcast routing Lecture 9 Broadcast routing Slide Broadcast Routing Route a packet from a source to all nodes in the network Possible solutions: Flooding: Each node sends packet on all outgoing links Discard packets received

More information

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms For Inference Fall 2014

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms For Inference Fall 2014 Suggested Reading: Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms For Inference Fall 2014 Probabilistic Modelling and Reasoning: The Junction

More information

Network Routing Protocol using Genetic Algorithms

Network Routing Protocol using Genetic Algorithms International Journal of Electrical & Computer Sciences IJECS-IJENS Vol:0 No:02 40 Network Routing Protocol using Genetic Algorithms Gihan Nagib and Wahied G. Ali Abstract This paper aims to develop a

More information

A Genetic Algorithm for Multiprocessor Task Scheduling

A Genetic Algorithm for Multiprocessor Task Scheduling A Genetic Algorithm for Multiprocessor Task Scheduling Tashniba Kaiser, Olawale Jegede, Ken Ferens, Douglas Buchanan Dept. of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB,

More information

These notes present some properties of chordal graphs, a set of undirected graphs that are important for undirected graphical models.

These notes present some properties of chordal graphs, a set of undirected graphs that are important for undirected graphical models. Undirected Graphical Models: Chordal Graphs, Decomposable Graphs, Junction Trees, and Factorizations Peter Bartlett. October 2003. These notes present some properties of chordal graphs, a set of undirected

More information

Learning Directed Probabilistic Logical Models using Ordering-search
