Research Article Hydrological Cycle Algorithm for Continuous Optimization Problems

Hindawi, Journal of Optimization, Volume 2017, 25 pages

Ahmad Wedyan, Jacqueline Whalley, and Ajit Narayanan
School of Engineering, Computer and Mathematical Sciences, Auckland University of Technology, Auckland, New Zealand
Correspondence should be addressed to Ahmad Wedyan
Received 2 August 2017; Revised 13 November 2017; Accepted 22 November 2017; Published 17 December 2017
Academic Editor: Efren Mezura-Montes

Copyright 2017 Ahmad Wedyan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A new nature-inspired optimization algorithm called the Hydrological Cycle Algorithm (HCA) is proposed based on the continuous movement of water in nature. In the HCA, a collection of water drops passes through various hydrological water cycle stages, such as flow, evaporation, condensation, and precipitation. Each stage plays an important role in generating solutions and avoiding premature convergence. The HCA shares information by direct and indirect communication among the water drops, which improves solution quality. Similarities and differences between HCA and other water-based algorithms are identified, and the implications of these differences for overall performance are discussed. A new topological representation for problems with a continuous domain is proposed. In proof-of-concept experiments, the HCA is applied to a variety of benchmarked continuous numerical functions. The results were found to be competitive in comparison with a number of other algorithms and validate the effectiveness of HCA. Also demonstrated is the ability of HCA to escape from local optima and converge to global solutions. Thus, HCA provides an alternative approach to tackling various types of multimodal continuous optimization problems as well as an overall framework for water-based particle algorithms in general.

1. Introduction

Optimization is the process of making something as good as possible. In computer science and operations research, an optimization problem can be defined as the problem of finding the best solution, or a solution better than the previously known best, among a set of feasible solutions by trying different variations of the input [1]. In mathematics and engineering, researchers use optimization algorithms to develop and improve new ideas and models that can be expressed as a function of certain variables. Optimization problems can be categorized as either continuous or discrete, based on the nature of the variables in the optimization function. In continuous optimization, each variable may take any value within a defined and specific range, and the variables are commonly real numbers. In discrete optimization, each variable takes a value from a finite set of possible values or a permutation of a set of numbers, commonly integers [2].

A variety of continuous optimization problems (COPs) exist in the literature for benchmarking purposes [3]. In these functions, the values of the continuous variables need to be found such that the function is either minimized or maximized. These functions have various structures and can be categorized as either unimodal or multimodal. A unimodal function has only one local optimum point, whereas a multimodal function has several local optimum points, with one or more of them being a global optimum. Each function may consist of a single variable or may be multivariate.
Typically, a unimodal function has one variable, while a multimodal function has multiple variables. Mathematically, a continuous objective function can be represented as follows:

minimize f(X): R^n → R,  (1)

where X is a vector that consists of n decision variables:

X = {x_1, x_2, ..., x_n} ∈ R^n.  (2)

A solution x* is called a global minimizer of a problem f(x) if f(x*) ≤ f(x) for all x ∈ X. Conversely, it is called a global maximizer if f(x*) ≥ f(x) for all x ∈ X. These functions are commonly used to evaluate the characteristics of new algorithms and to measure an algorithm's overall performance, convergence rate, robustness, precision, and ability to escape from local optima [3].

These empirical results can be used to compare a new algorithm with existing optimization algorithms and to gain an understanding of the algorithm's behavior on different kinds of problem. For instance, multimodal functions can be used to check the ability of an algorithm to escape local optima and explore new regions, especially when the global optimum is surrounded by many local optima. Similarly, functions with flat surfaces add difficulty to reaching the global optimum, as the flatness gives no indication of the direction towards it [3]. Practically, the number of variables and the domain of each variable (the function dimensionality) affect the complexity of the function. In addition, other factors such as the number of local/global optimum points and the distribution of local optima relative to the global optima are crucial in determining the complexity of a function, especially when a global minimum point is located very close to local minima [3]. Some of these functions represent real-life optimization problems, while others are artificial problems. Furthermore, a function can be classified as unconstrained or constrained. An unconstrained function has no restrictions on the values of its variables.

Due to the presence of a large variety of optimization problems, it is hard to find a single algorithm that can solve all types of optimization problems effectively. On the other hand, the No-Free-Lunch (NFL) theorems posit that no particular algorithm performs better than all other algorithms for all problems, and that what an algorithm gains in performance on some problems can be lost on other problems [5]. Consequently, we can infer that some algorithms are more applicable and compatible with particular classes of optimization problems than others. Therefore, there is always a need to design and evaluate new algorithms for their potential to improve performance on certain types of optimization problem. For these reasons, the main objective of this paper is to develop a new optimization algorithm and determine its performance on various benchmarked problems, to gain an understanding of the algorithm and evaluate its applicability to other unsolved problems.

Many established nature-inspired algorithms, especially water-based algorithms, have proven successful in solving COPs. Nevertheless, these water-based algorithms do not take into account the broader concepts of the hydrological water cycle, how the water drops and paths are formed, or the fact that water is a renewable and recyclable resource. That is, previous water-based algorithms partially utilize some stages of the natural water cycle and demonstrate only some of the important aspects of a full hydrological water cycle for solving COPs. The main contribution of this paper is to delve deeper into hydrological theory and the fundamental concepts surrounding water droplets to inform the design of a new optimization algorithm. This algorithm provides a new hydrological-based conceptual framework within which existing and future work in water-based algorithms can be located. In this paper, we present a new nature-inspired optimization algorithm called the Hydrological Cycle Algorithm (HCA) for solving optimization problems. HCA simulates nature's hydrological water cycle.
More specifically, it involves a collection of water drops passing through different phases, such as flow (runoff), evaporation, condensation, and precipitation, to generate a solution. It can be considered a swarm intelligence optimization algorithm for those parts of the cycle in which a collection of water drops moves through the search space, but it can also be considered an evolutionary algorithm for the other parts of the cycle, in which information is exchanged and shared. By using the full hydrological water cycle as a conceptual framework, we show that previous water-based algorithms have predominantly used only swarm-like aspects inspired by precipitation and flow. HCA, however, uses all four stages to form a complete water-based approach to solving optimization problems efficiently. In particular, we show that for certain problems HCA leads to improved performance and solution quality, and we demonstrate how the condensation stage is utilized to allow information sharing among the particles. This information sharing helps improve performance and reduces the number of iterations needed to reach the optimal solution. Our claims for HCA are to be judged not on improved efficiency but on improved conceptual bioinspiration as well as its contribution to the area of water-based algorithms. However, we also show that HCA is competitive in comparison with several other nature-inspired algorithms, both water-based and non-water-based.

In summary, the paper presents a new overall conceptual framework for water-based algorithms which applies the hydrological cycle in its entirety to continuous optimization problems as a proof of concept (i.e., it validates the full hydrologically inspired framework). The benchmarked mathematical test functions chosen in this paper are useful for evaluating the performance characteristics of any new approach, especially the effectiveness of its exploration and exploitation processes. Solving these functions (i.e., finding the global-optimal solution) is challenging for most algorithms, even with a few variables, because their artificial landscapes may contain many local optimal solutions. Our minimal criterion for success is that the HCA's performance is comparable to that of other well-known algorithms, including those that only partially adopt the full HCA framework. We maintain that our results are competitive and that therefore water-inspired and non-water-inspired researchers lose nothing by adopting the richer bioinspiration that comes with HCA in its entirety rather than adopting only parts of the full water cycle.

The HCA's performance is evaluated in terms of finding optimal solutions and computational effort (number of iterations). The algorithm's behavior, its convergence rate, and its ability to deal with multimodal functions are also investigated and analyzed. The main aim of this paper is not to prove the superiority of HCA over other algorithms. Instead, the aim is to provide researchers in science, engineering, operations research, and machine learning with a new water-based tool for dealing with continuous variables in convex, unconstrained, and nonlinear problems. Further details and features of the HCA are described in Section 3.

The remainder of this paper is organized as follows. Section 2 discusses various related swarm intelligence techniques used to tackle COPs and distinguishes HCA from related algorithms.

Section 3 describes the hydrological cycle in nature and explains the proposed HCA. Section 4 outlines the experimental setup, the problem representation, the results obtained, and the comparative evaluation conducted with other algorithms. Finally, Section 5 presents concluding remarks and outlines plans for further research.

2. Previous Swarm Intelligence Approaches for Solving Continuous Problems

Many metaheuristic optimization algorithms have been designed, and their performance has been evaluated on benchmark functions. Among these are nature-inspired algorithms, which have received considerable attention. Each algorithm uses a different approach and representation to solve the target problem. It is worth mentioning that our focus is on studies that involve solving unconstrained continuous functions with few variables; if a new algorithm cannot deal with these cases, there is little chance of it dealing with more complex functions.

Ant colony optimization (ACO) is one of the best-known techniques for solving discrete optimization problems and has been extended to deal with COPs in many studies. For example, in [6], the authors divided the function domain into a certain number of regions representing the trial solution space for the ants to explore. They then generated two types of ants (global and local ants) and distributed them to explore these regions. The global ants were responsible for updating the fitness of the regions globally, whereas the local ants did the same locally. The approach incorporated mutation and crossover operations to improve solution quality. Socha and Dorigo [7] extended the ACO algorithm to tackle continuous-domain problems. The extension considered continuous probability distributions instead of discrete probabilities and incorporated the Gaussian probability density function to choose the next point. The pheromone evaporation procedure was modified so that old solutions were removed from the candidate solution pool and replaced with new and better solutions. A weight associated with each ant's solution represented the fitness of that solution. Other ACO-inspired approaches with various implementations that do not follow the original ACO algorithm structure have also been developed to solve COPs [8-11]. In further work, the performance of ACO was evaluated on continuous function optimization [12].

One of the problems with ACO is that reinforcement of paths is the only method for sharing information between the ants, and it does so in an indirect way. A heavily pheromone-reinforced path is naturally considered a good path for other ants to follow. No other mechanism exists to allow ants to share information for exploration purposes, except indirectly through the evaporation of pheromone over time. Exploration diminishes over time with ACO, which suits problems where there may be only one global solution. But if the aim is to find a number of alternative solutions, the rate of evaporation has to be strictly controlled to ensure that no single path dominates. Evaporation control can lead to other side effects in the classic ACO paradigm, such as longer convergence time, resulting in the need to add possibly nonintuitive information-sharing mechanisms, such as global updates, for exploration purposes.

Although strictly speaking they are not a swarm intelligence approach, genetic algorithms (GAs) [13] have been applied to many COPs. A GA uses simple operations such as crossover and mutation to manipulate and direct the population and to help escape from local optima.
The values of the variables are encoded using binary or real numbers [14]. However, as with other algorithms, GA performance can depend critically on the initial population's values, which are usually random. This was clarified by Maaranen et al. [15], who investigated the importance and effect of the initial population values on the quality of the final solution. In [16], the authors also focused on the choice of the initial population values and modified the algorithm to intensify the search in the most promising areas of the solution space. The main problem with standard GAs for optimization continues to be the lack of long-termism, where fitness for exploitation may need to be secondary to allow for adequate exploration to assess the problem space for alternative solutions. One possible way to deal with this problem is to hybridize the GA with other approaches, such as in [17], where a hybrid of a GA and particle swarm optimization (PSO) is proposed for solving multimodal functions.

James and Russell [18] examined the performance of the PSO algorithm on some continuous optimization functions. Later, researchers such as Chen et al. [19] modified the PSO algorithm to help overcome the problem of premature convergence. Additionally, PSO is used for global optimization in [20]. PSO has also been hybridized with ACO to improve performance on high-dimensional multimodal functions [21]. Further, a comparative study between GA and PSO on numerical benchmark problems can be found in [22]. Other versions of the PSO algorithm with crossover and mutation operators to improve performance have also been proposed, such as in [23]. Once such GA operators are introduced, the question arises as to what advantages a PSO approach has over a standard evolutionary approach to the same problem.

The bees algorithm (BA) has been used to solve a number of benchmark functions [24]. This algorithm simulates the behavior of swarms of honey bees during food foraging. In BA, scout bees are sent to search for food sources by moving randomly from one place to another. On returning to the hive, they share information with the other bees about the location, distance, and quality of the food source. The algorithm produces good results for several benchmark functions. However, the information sharing is direct and confined to subsets of bees, leading to the possibility of convergence to a local optimum or divergence away from the global optimum.

The grenade explosion method (GSM) is an evolutionary algorithm which has been used to optimize real-valued problems [25]. The GSM is based on the mechanism associated with a grenade explosion and is intended to overcome problems associated with crowding, where candidate solutions have converged to a local minimum. When a grenade explodes, shrapnel pieces are thrown to destroy objects in the vicinity of the explosion. In the GSM, losses are calculated for each piece of shrapnel, and the effect of each piece of shrapnel is calculated.

The highest loss effect represents valuable objects existing in that area. Then, another grenade is thrown into the area to cause greater loss, and so on. Overall, the GSM was able to find the global minimum faster than other algorithms in seven out of eight benchmarked tests. However, the algorithm requires the maintenance of a nonintuitive territory radius, preventing shrapnel pieces from coming too close to each other and forcing shrapnel to spread uniformly over the search space. Also required is an explosion range for the shrapnel. These control parameters work well with problem spaces containing many local optima, so that global optima can be found after suitable exploration. However, the relationship between these control parameters and problem spaces of unknown topology is not clear.

2.1. Water-Based Algorithms. Algorithms inspired by water flow have become increasingly common for tackling optimization problems. Two of the best-known examples are intelligent water drops (IWD) and the water cycle algorithm (WCA). These algorithms have been shown to be successful in many applications, including various COPs.

The IWD algorithm is a swarm-based algorithm inspired by the ground flow of water in nature and the actions and reactions that occur during that process [26]. In IWD, a water drop has two properties: movement velocity and the amount of soil it carries. These properties are changed and updated for each water drop throughout its movement from one point to another. The velocity of a water drop is affected and determined only by the amount of soil between two points, while the amount of soil it carries is equal to the amount of soil removed from a path [26]. The IWD algorithm has also been used to solve COPs. Shah-Hosseini [27] utilized binary variable values in the IWD and created a directed graph of (M × P) nodes, in which M identifies the number of variables and P identifies the precision (number of bits for each variable), which was assumed to be 32. The graph comprised 2(M × P) directed edges connecting the nodes, with each edge having a value of zero or one, as depicted in Figure 1.

Figure 1: Representation of a continuous problem in the IWD algorithm.

One of the advantages of this representation is that it is intuitively consistent with the natural idea of water flowing in a specific direction. Also, the IWD algorithm was augmented with a mutation-based local search to enhance its solution quality. In this process, an edge is selected at random from a solution and replaced by another edge, which is kept if it improves the fitness value; this process was repeated a number of times for each solution. This mutation is applied to all the generated solutions, and then the soil is updated on the edges belonging to the best solution. However, the variables' values have to be converted from binary to real numbers to be evaluated. Also, once mutation (but not crossover) is introduced into IWD, the question arises as to what advantages IWD has over standard evolutionary algorithms that use simpler notations for representing continuous numbers. Detailed differences between IWD and our HCA are described in Section 3.9.

Various researchers have applied the WCA to COPs [28, 29]. The WCA is also based on the flow of ground water, through rivers and streams to the sea. The algorithm constructs a solution using a population of streams, where each stream is a candidate solution. It initially generates random solutions for the problem by distributing the streams into different positions according to the problem's dimensions.
The best stream is considered a sea, and a specific number of streams are considered rivers. The streams are then made to flow towards the positions of the rivers and the sea. The position of the sea indicates the global-best solution, whereas those of the rivers indicate the local best solution points. In each iteration, the position of each stream may be swapped with any of the river positions or (counterintuitively) the sea position, according to the fitness of that stream. Therefore, if a stream's solution is better than that of a river, that stream becomes a river, and the corresponding river becomes a stream. The same process occurs between rivers and the sea. Evaporation occurs in the rivers/streams when the distance between a river/stream and the sea is very small (close to zero). Following evaporation, new streams are generated with different positions in a process that can be described as rain. These processes help the algorithm to escape from local optima.

WCA treats streams, rivers, and the sea as a set of moving points in a multidimensional space and does not consider stream/river formation (movements of water and soil). Therefore, the overall structure of the WCA is quite similar to that of the PSO algorithm. For instance, the PSO algorithm has Gbest and Pbest points that indicate the global and local solutions, with the Gbest point guiding the Pbest points to converge towards it. In a similar manner, the WCA has a number of rivers that represent the local optimal solutions, and a sea position that represents the global-optimal solution. The sea is used as a guide for the streams and rivers. Moreover, in PSO, a particle at the Pbest point updates its position towards the Gbest point. Analogously, the WCA exchanges the positions of rivers with that of the sea when the fitness of the river is better. The main difference is the consideration of the evaporation and raining stages, which help the algorithm to escape from local optima.

An improved version of WCA called dual-system WCA (DSWCA) has been presented for solving constrained engineering optimization problems [30]. The DSWCA improvements involve adding two cycles (outer and inner). The outer cycle is in accordance with the WCA and is designed to perform exploration. The inner cycle, which is based on the ocean cycle, was designed as an exploitation process.

The confluence-between-rivers process is also included to improve convergence speed. These improvements aim to help the original WCA avoid falling into local optimal solutions. The DSWCA was able to provide better results than WCA. In a further extension of WCA, Heidari et al. proposed a chaotic WCA (CWCA) that incorporates thirteen chaotic patterns into the WCA movement process to improve WCA's performance and deal with the premature convergence problem [31]. The experimental results demonstrated improvement in controlling the premature convergence problem. However, WCA and its extensions introduce many additional features, many of which are not compatible with nature-inspiration, to improve performance and make WCA competitive with non-water-based algorithms. Detailed differences between WCA and our HCA are described in Section 3.9.

The review of previous approaches shows that nature-inspired swarm approaches tend to introduce evolutionary concepts of mutation and crossover for sharing information, to adopt nonplausible global data structures to prevent convergence to local optima, or to add more centralized control parameters to preserve the balance between exploration and exploitation in increasingly difficult problem domains, thereby compromising self-organization. When such approaches are modified to deal with COPs, complex and unnatural representations of numbers may also be required; these add to the overheads of the algorithm and can lead to what may appear to be forced and unnatural ways of dealing with the complexities of a continuous problem.

In swarm algorithms, a number of cooperative homogenous entities explore and interact with the environment to achieve a goal, which is usually to find the global-optimal solution to a problem through exploration and exploitation. Cooperation can be accomplished by some form of communication that allows the swarm members to share exploration and exploitation information to produce high-quality solutions [32, 33]. Exploration aims to acquire as much information as possible about promising solutions, while exploitation aims to improve such promising solutions. Controlling the balance between exploration and exploitation is usually considered critical when searching for more refined solutions as well as more diverse solutions. Also important is the amount of exploration and exploitation information to be shared, and when.

Information sharing in nature-inspired algorithms usually includes metrics for evaluating the usefulness of information and mechanisms for how it should be shared. In particular, these metrics and mechanisms should not contain more knowledge or require more intelligence than can be plausibly assigned to the swarm entities, where the emphasis is on the emergence of complex behavior from relatively simple entities interacting in nonintelligent ways with each other. Information sharing can be done either randomly or nonrandomly among entities. Different approaches are used to share information, such as direct or indirect interaction. Two sharing models are one-to-one, in which explicit interaction occurs between entities, and many-to-many (broadcasting), in which interaction occurs indirectly between the entities through the environment [34]. Usually, either direct or indirect interaction between entities is employed in an algorithm for information sharing, and the use of both types in the same algorithm is often neglected.
For instance, the ACO algorithm uses indirect communication for information sharing, where ants lay amounts of pheromone on paths depending on the usefulness of what they have explored. The more promising the path, the more pheromone is added, leading other ants to exploit this path further. Pheromone is not directed at any other ant in particular. In the artificial bee colony (ABC) algorithm, bees share information directly (specifically, the direction and distance to flowers) in a kind of waggle dancing [35]. The information received by other bees depends on which bee they observe and not on the information gathered from all bees. In a GA, the information exchange occurs directly in the form of crossover operations between selected members (chromosomes and genes) of the population. The quality of the information shared depends on the members chosen, and there is no overall store of information available to all members. In PSO, on the other hand, the particles have their own information as well as access to global information to guide their exploitation. This can lead to competitive and cooperative optimization techniques [36]. However, no attempt is made in standard PSO to share information possessed by individual particles. This lack of individual-to-individual communication can lead to early convergence of the entire population of particles to a nonoptimal solution. Selective information sharing between individual particles will not always avoid this narrow exploitation problem, but, if undertaken properly, it could allow the particles to find alternative and more diverse solution spaces where the global optimum lies.

As can be seen from the above discussion, the distinctions between communication (direct and indirect) and information sharing (random or nonrandom) can become blurred. In HCA, we utilize both direct and indirect communication to share information among selected members of the whole population. Direct communication occurs in the condensation stage of the water cycle, where some water drops collide with each other when they evaporate into the cloud. On the other hand, indirect communication occurs between the water drops and the environment via the soil and path depth. Utilizing information sharing helps to diversify the search space and improve overall performance through better exploitation. In particular, we show how HCA with direct and indirect communication suits continuous problems, where individual particles share useful information directly when searching a continuous space.

Furthermore, in some algorithms, indirect communication is used not just to find a possible solution but also to reinforce an already found promising solution. Such reinforcement may lead the algorithm to get stuck in the same solution, which is known as stagnation behavior [37]. Such reinforcement can also cause premature convergence in some problems. An important feature used in HCA to avoid this problem is the path depth. The depths of paths are constantly changing, where deep paths will have less chance of being chosen compared with shallow paths. More details will be provided about this feature in Section 3.3.

In summary, this paper aims to introduce a novel nature-inspired approach for dealing with COPs which also introduces a continuous number representation that is natural for the algorithm. The cyclic nature of the algorithm contributes to self-organization as the system converges to the solution through direct and indirect communication between particles, feedback from the environment, and internal self-evaluation. Finally, our HCA addresses some of the weaknesses of other similar algorithms. HCA is inspired by the full water cycle, not parts of it as exemplified by IWD and WCA.

3. The Hydrological Cycle Algorithm

The water cycle, also known as the hydrologic cycle, describes the continuous movement of water in nature [38]. It is renewable because water particles are constantly circulating from the land to the sky and vice versa; therefore, the amount of water remains constant. Water drops move from one place to another by natural physical processes, such as evaporation, precipitation, condensation, and runoff. In these processes, the water passes through different states.

3.1. Inspiration. The water cycle starts when surface water gets heated by the sun, causing water from different sources to change into vapor and evaporate into the air. Then the water vapor cools and condenses as it rises, becoming drops. These drops combine and collect with each other and eventually become so saturated and dense that they fall back because of the force of gravity, in a process called precipitation. The resulting water flows on the surface of the Earth into ponds, lakes, or oceans, where it again evaporates back into the atmosphere. Then the entire cycle is repeated.

In the flow (runoff) stage, the water drops start moving from one location to another based on Earth's gravity and the ground topography. When the water is scattered, it chooses a path with less soil and fewer obstacles. Assuming no obstacles exist, water drops will take the shortest path towards the center of the Earth (i.e., the oceans), owing to the force of gravity. Water drops have force because of this gravity, and this force causes changes in motion depending on the topology. As the water flows in rivers and streams, it picks up sediments and transports them away from their original location. These processes are known as erosion and deposition. They help to create the topography of the ground by making new paths with more soil removal, or fading others as they become less likely to be chosen owing to too much soil deposition. These processes are highly dependent on water velocity. For instance, a river is continually picking up and dropping off sediments from one point to another. When the river flow is fast, soil particles are picked up, and the opposite happens when the river flow is slow. Moreover, there is a strong relationship between the water velocity and the suspension load (the amount of dissolved soil in the water) that it currently carries. The water velocity is higher when only a small amount of soil is being carried and lower when larger amounts are carried.

In addition, the term flow rate, or discharge, can be defined as the volume of water in a river passing a defined point over a specific period. Mathematically, the flow rate is calculated based on the following equation [39]:

Q = V × A  [Q = V × (W × D)],  (3)

where Q is the flow rate, A is the cross-sectional area, V is the velocity, W is the width, and D is the depth. The cross-sectional area is calculated by multiplying the depth by the width of the stream. The velocity of the flowing water is defined as the quantity of discharge divided by the cross-sectional area.
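As a quick worked illustration of (3) (the numbers here are ours, chosen only to show the arithmetic, and are not taken from the paper): a stream 4 m wide and 0.5 m deep carrying a discharge of 6 m³/s has

A = W × D = 4 × 0.5 = 2 m²,  V = Q / A = 6 / 2 = 3 m/s.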
Therefore, there is an inverse relationship between the velocity and the cross-sectional area [40]: the velocity increases when the cross-sectional area is small, and vice versa. If we assume that the river has a constant flow rate (velocity × cross-sectional area = constant) and that the width of the river is fixed, then we can conclude that the velocity is inversely proportional to the depth of the river, in which case the velocity decreases in deep water and vice versa. Therefore, the amount of soil deposited increases as the velocity decreases. This explains why less soil is taken from deep water and more soil is taken from shallow water: a deep river will have water moving more slowly than a shallow river.

In addition to river depth and the force of gravity, there are other factors that affect the flow velocity of water drops and cause them to accelerate or decelerate. These factors include obstructions, the amount of soil existing in the path, the topography of the land, variations in channel cross-sectional area, and the amount of dissolved soil being carried (the suspension load).

Water evaporation increases with temperature, with maximum evaporation at 100°C (boiling point). Conversely, condensation increases as the water vapor cools towards 0°C. The evaporation and condensation stages play important roles in the water cycle. Evaporation is necessary to ensure that the system remains in balance. In addition, the evaporation process helps to decrease the temperature and keeps the air moist. Condensation is a very important process as it completes the water cycle. When water vapor condenses, it releases into the environment the same amount of thermal energy that was needed to make it a vapor. However, in nature, it is difficult to measure the precise evaporation rate owing to the existence of different sources of evaporation; hence, estimation techniques are required [38].

The condensation process starts when the water vapor rises into the sky; then, as the temperature decreases in the higher layers of the atmosphere, the particles with lower temperature have a greater chance of condensing [41]. Figure 2(a) illustrates the situation where there are no attractions (or connections) between the particles at high temperature. As the temperature decreases, the particles collide and agglomerate to form small clusters, leading to clouds, as shown in Figure 2(b). An interesting phenomenon, in which some of the large water drops may eliminate other smaller water drops, occurs during the condensation process as some of the water drops, known as collectors, fall from a higher layer to a lower layer in the atmosphere (in the cloud). Figure 3 depicts the action of a water drop falling and colliding with smaller water drops.

Figure 2: Illustration of the effect of temperature on water drops ((a) hot water; (b) cold water).
Figure 3: A collector water drop in action.

Consequently, the water drop grows as a result of coalescence with the other water drops [42]. Based on this phenomenon, only water drops that are sufficiently large will survive the trip to the ground. Of the numerous water drops in the cloud, only a portion will make it to the ground.

3.2. The Proposed Algorithm. Given the brief description of the hydrological cycle above, the IWD algorithm can be interpreted as covering only one stage of the overall water cycle: the flow stage. Although the IWD algorithm takes into account some of the characteristics and factors that affect water drops flowing through rivers, and the actions and reactions along the way, it neglects other factors that could be useful in solving optimization problems through the adoption of other key concepts taken from the full hydrological process.

Studying the full hydrological water cycle gave us the inspiration to design a new and potentially more powerful algorithm within which the original IWD algorithm can be considered a subpart. We divide our algorithm into four main stages: flow (runoff), evaporation, condensation, and precipitation.

The HCA can be described formally as consisting of a set of artificial water drops (WDs), such that

HCA = {WD_1, WD_2, WD_3, ..., WD_n}, where n ≥ 1.  (4)

The characteristics associated with each water drop are as follows:
(i) V^WD: the velocity of the water drop.
(ii) Soil^WD: the amount of soil the water drop carries.
(iii) ψ^WD: the solution quality of the water drop.

The input to the HCA is a graph representation of a problem's solution space. The graph can be a fully connected undirected graph G = (N, E), where N is the set of nodes and E is the set of edges between the nodes. In order to find a solution, the water drops are distributed randomly over the nodes of the graph and then traverse the graph (G), according to various rules, via the edges (E), searching for the best or optimal solution. The characteristics associated with each edge are the initial amount of soil on the edge and the edge depth. The initial amount of soil is the same for all edges. The depth measures the distance from the water surface to the riverbed, and it can be calculated by dividing the length of the edge by the amount of soil. Therefore, the depth varies and depends on the amount of soil that exists on the path and on its length. The depth of a path increases when more soil is removed over the same length. A deep path can be interpreted as either the path being lengthy or there being less soil. Figures 4 and 5 illustrate the relationship between path depth, soil, and path length.

Figure 4: Depth increases as length increases for the same amount of soil.
Figure 5: Depth increases when the amount of soil is reduced over the same length.

The HCA goes through a number of cycles and iterations to find a solution to a problem. One iteration is considered complete when all water drops have generated solutions based on the problem constraints. A water drop iteratively constructs a solution for the problem by continuously moving between the nodes. Each iteration consists of specific steps (explained below). A cycle, on the other hand, represents a varied number of iterations. A cycle is considered complete when the temperature reaches a specific value, which makes the water drops evaporate, condense, and precipitate again. This procedure continues until the termination condition is met.
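To make these entities concrete, here is a minimal Python sketch of a water drop and of edge initialization, assuming a fully connected graph stored as a dictionary keyed by node pairs. The initial soil value and the starting velocity are illustrative choices, not values prescribed by the paper.

```python
from dataclasses import dataclass, field
from itertools import permutations

INITIAL_SOIL = 10000.0  # same initial soil on every edge (illustrative value)

@dataclass
class WaterDrop:
    velocity: float = 100.0            # V^WD (illustrative starting value)
    carrying_soil: float = 1.0         # Soil^WD (nonzero so later ratios are defined)
    quality: float = float("inf")      # psi^WD, the drop's solution quality
    path: list = field(default_factory=list)   # nodes visited so far

def build_edges(nodes, lengths):
    """Create every directed edge with the same initial soil; depth = length / soil (Eq. (8))."""
    edges = {}
    for i, j in permutations(nodes, 2):
        length = lengths[(i, j)]
        edges[(i, j)] = {"soil": INITIAL_SOIL,
                         "length": length,
                         "depth": length / INITIAL_SOIL}
    return edges

nodes = ["A", "B", "C"]
lengths = {(i, j): 100.0 for i in nodes for j in nodes if i != j}
edges = build_edges(nodes, lengths)
drops = [WaterDrop() for _ in range(4)]
print(edges[("A", "B")], drops[0].velocity)
```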
3.3. Flow Stage (Runoff). This stage represents the construction stage, in which the HCA explores the search space. This is carried out by allowing the water drops to flow (scattered or regrouped) in different directions and constructing various solutions.

Each water drop constructs a solution incrementally by moving from one point to another. Whenever a water drop wants to move towards a new node, it has to choose between different numbers of nodes (various branches). It calculates the probability of all the unvisited nodes and chooses the highest-probability node, taking into consideration the problem constraints. This can be described as a state transition rule that determines the next node to visit. In HCA, the node with the highest probability will be selected. The probability of a node is calculated using

P_i^WD(j) = [f(Soil(i, j))² × g(Depth(i, j))] / Σ_{k ∉ Vc(WD)} [f(Soil(i, k))² × g(Depth(i, k))],  (5)

where P_i^WD(j) is the probability of choosing node j from node i, and f(Soil(i, j)) is the inverse of the soil between i and j, calculated using

f(Soil(i, j)) = 1 / (ε + Soil(i, j)).  (6)

ε = 0.01 is a small value that is used to prevent division by zero. The second factor of the transition rule is the inverse of the depth, which is calculated using

g(Depth(i, j)) = 1 / Depth(i, j).  (7)

Depth(i, j) is the depth between nodes i and j and is calculated by dividing the length of the path by the amount of soil:

Depth(i, j) = Length(i, j) / Soil(i, j).  (8)

The depth value can be normalized to the range [1, 100], as it might otherwise be very small. The depth of a path is constantly changing because it is influenced by the amount of existing soil: the depth is inversely proportional to the amount of soil existing on a path of fixed length. Water drops will choose paths with less depth over other paths. This enables exploration of new paths and diversifies the generated solutions.

Assume the following scenario (depicted in Figure 6), where water drops have to choose between nodes A and B after node S. Initially, both edges have the same length and the same amount of soil (line thickness represents the soil amount); consequently, the depth values of the two edges will be the same. After some water drops choose node A to visit, edge (S-A) as a result has less soil and is deeper. According to the new state transition rule, the next water drops will be forced to choose node B, because edge (S-B) will be shallower than (S-A). This technique provides a way to avoid stagnation and to explore the search space efficiently.

Figure 6: The relationship between soil and depth over the same length.
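The transition rule in (5)-(8) can be sketched in a few lines of Python. This is an illustrative reading rather than the authors' implementation: the soil and length values, the node labels, and the ε constant are assumed for the example, and the drop simply picks the highest-probability unvisited node.

```python
EPSILON = 0.01  # small constant from Eq. (6), used to avoid division by zero

def depth(length, soil):
    # Eq. (8): depth of an edge is its length divided by the soil on it
    return length / soil

def transition_probabilities(current, unvisited, soil, length):
    """Eq. (5): probability of moving from `current` to each unvisited node."""
    scores = {}
    for j in unvisited:
        f_soil = 1.0 / (EPSILON + soil[(current, j)])                     # Eq. (6)
        g_depth = 1.0 / depth(length[(current, j)], soil[(current, j)])   # Eq. (7)
        scores[j] = (f_soil ** 2) * g_depth
    total = sum(scores.values())
    return {j: s / total for j, s in scores.items()}

# Three candidate nodes reachable from S; the drop selects the most probable one.
soil = {("S", "A"): 400.0, ("S", "B"): 500.0, ("S", "C"): 500.0}
length = {("S", "A"): 100.0, ("S", "B"): 100.0, ("S", "C"): 200.0}
probs = transition_probabilities("S", ["A", "B", "C"], soil, length)
print(probs, max(probs, key=probs.get))
```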

3.3.1. Velocity Update. After selecting the next node, the water drop moves to the selected node and marks it as visited. Each water drop has a variable velocity (V). The velocity of a water drop may increase or decrease while it is moving, based on the factors mentioned above (the amount of soil existing on the path, the depth of the path, etc.). Mathematically, the velocity of a water drop at time (t + 1) can be calculated using

V^WD_{t+1} = [K × V^WD_t] + α × (V^WD / Soil(i, j)) + √(V^WD / Soil^WD) + (1 / ψ^WD) + √(V^WD / Depth(i, j)).  (9)

Equation (9) defines how the water drop updates its velocity after each movement. Each part of the equation has an effect on the new velocity of the water drop, where V^WD_t is the current water drop velocity and K is a uniformly distributed random number between [0, 1] that refers to the roughness coefficient. The roughness coefficient represents factors that may cause the water drop's velocity to decrease. Owing to difficulties in measuring the exact effect of these factors, some randomness is associated with the velocity. The second term of the expression in (9) reflects the relationship between the amount of soil existing on a path and the velocity of a water drop. The velocity of the water drops increases when they traverse paths with less soil. Alpha (α) is a relative influence coefficient that emphasizes this part of the velocity update equation. The third term of the expression represents the relationship between the amount of soil that a water drop is currently carrying and its velocity. Water drop velocity increases with a decreasing amount of carried soil. This gives weak water drops a chance to increase their velocity, as they do not hold much soil. The fourth term of the expression refers to the inverse ratio of a water drop's fitness, with velocity increasing with higher fitness. The final term of the expression indicates that a water drop will be slower in a deep path than in a shallow path.

3.3.2. Soil Update. Next, the amount of soil existing on the path and the depth of that path are updated. A water drop can remove (or add) soil from (or to) a path while moving, based on its velocity. This is expressed by

Soil(i, j) = [PN × Soil(i, j)] − ΔSoil(i, j) − √(1 / Depth(i, j))   if V^WD ≥ Avg(all V^WDs) (erosion),
Soil(i, j) = [PN × Soil(i, j)] + ΔSoil(i, j) + √(1 / Depth(i, j))   otherwise (deposition).  (10)

PN represents a coefficient (i.e., a sediment transport rate, or gradation coefficient) that may affect the reduction in the amount of soil. Soil can be removed only if the current water drop's velocity is greater than the average velocity of all the water drops; otherwise, an amount of soil is added (soil deposition). Consequently, the water drop is able to transfer an amount of soil from one place to another, and it usually transfers it from the fast parts of a path to the slow parts. The increasing soil amount on some paths favors the exploration of other paths during the search process and avoids entrapment in local optimal solutions. The amount of soil existing between node i and node j is calculated and changed using

ΔSoil(i, j) = 1 / time^WD_{i,j},  (11)

such that

time^WD_{i,j} = Distance(i, j) / V^WD_{t+1}.  (12)

Equation (11) shows that the amount of soil being removed is inversely proportional to time. Based on the second law of motion, time is equal to distance divided by velocity; therefore, time is proportional to the path length and inversely proportional to velocity. Furthermore, the amount of soil being removed or added is inversely proportional to the depth of the path. This is because shallow soils are easier to remove than deep soils. Therefore, the amount of soil being removed decreases as the path depth increases. This factor facilitates the exploration of new paths, because less soil is removed from the deep paths, as they have been used many times before. This may extend the time needed to search for and reach optimal solutions. The depth of the path needs to be updated when the amount of soil existing on the path changes, using (8).
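The following Python sketch shows one way to read (9)-(12) together: the drop's velocity is updated first, and soil is then eroded from, or deposited on, the traversed edge. The coefficient values (α, PN), the fitness, and the edge data are illustrative assumptions, not the parameter settings used in the paper.

```python
import math
import random

ALPHA = 2.0   # relative influence coefficient alpha in Eq. (9) (assumed value)
PN = 0.99     # sediment transport / gradation coefficient in Eq. (10) (assumed value)

def update_velocity(v, edge_soil, carried_soil, fitness, edge_depth):
    """Eq. (9): new velocity from roughness, edge soil, carried soil, fitness, and depth."""
    k = random.uniform(0.0, 1.0)              # roughness coefficient K in [0, 1]
    return (k * v
            + ALPHA * (v / edge_soil)         # less soil on the path -> faster
            + math.sqrt(v / carried_soil)     # less carried soil -> faster
            + 1.0 / fitness                   # better (smaller) fitness value -> faster
            + math.sqrt(v / edge_depth))      # the deeper the path, the smaller this term

def update_edge_soil(edge_soil, edge_depth, length, v_new, avg_velocity):
    """Eqs. (10)-(12): erode soil if the drop is faster than average, else deposit."""
    time_ij = length / v_new                  # Eq. (12)
    delta_soil = 1.0 / time_ij                # Eq. (11)
    if v_new >= avg_velocity:                 # erosion
        return PN * edge_soil - delta_soil - math.sqrt(1.0 / edge_depth)
    return PN * edge_soil + delta_soil + math.sqrt(1.0 / edge_depth)  # deposition

v_new = update_velocity(v=100.0, edge_soil=500.0, carried_soil=5.0, fitness=4.0, edge_depth=0.2)
soil_new = update_edge_soil(edge_soil=500.0, edge_depth=0.2, length=100.0,
                            v_new=v_new, avg_velocity=90.0)
print(round(v_new, 2), round(soil_new, 2))
```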
3.3.3. Soil Transportation Update. In the HCA, we consider the fact that the amount of soil a water drop carries reflects its solution quality. This is done by associating the quality of the water drop's solution with the amount of soil it carries. Figure 7 illustrates the fact that a water drop with more carried soil has a better solution.

Figure 7: Relationship between the amount of carried soil and its quality.

To simulate this association, the amount of soil that the water drop is carrying is updated with a portion of the soil being removed from the path, divided by the last solution quality found by that water drop. Therefore, a water drop with a better solution will carry more soil. This can be expressed as follows:

Soil^WD = Soil^WD + (ΔSoil(i, j) / ψ^WD).  (13)

The idea behind this step is to let a water drop preserve its previous fitness value. Later, the amount of carried soil is used in updating the velocity of the water drops.

3.3.4. Temperature Update. The final step that occurs at this stage is the updating of the temperature. The new temperature value depends on the solution quality generated by the water drops in the previous iterations. The temperature will be updated as follows:

Temp(t + 1) = Temp(t) + ΔTemp,  (14)

where

ΔTemp = β × (Temp(t) / ΔD)   if ΔD > 0,
ΔTemp = Temp(t) / 10           otherwise,  (15)

and where the coefficient β is determined based on the problem. The difference ΔD is calculated based on

ΔD = MaxValue − MinValue,  (16)

such that

MaxValue = max[SolutionsQuality(WDs)],
MinValue = min[SolutionsQuality(WDs)].  (17)

According to (16), the increase in temperature is affected by the difference between the best solution (MinValue) and the worst solution (MaxValue). A smaller difference means that there was not much improvement in the last few iterations and, as a result, the temperature will increase more. This means that there is no need to evaporate the water drops as long as they can find different solutions in each iteration. The flow stage is repeated a number of times until the temperature reaches a certain value. When the temperature is sufficient, the evaporation stage begins.

3.4. Evaporation Stage. In this stage, the number of water drops that will evaporate (the evaporation rate) when the temperature reaches a specific value is first determined. The evaporation rate is chosen randomly between one and the total number of water drops:

ER = Random(1, N),  (18)

where ER is the evaporation rate and N is the number of water drops. According to the evaporation rate, a specific number of water drops is selected to evaporate. The selection is done using the roulette-wheel selection method, taking into consideration the solution quality of each water drop. Only the selected water drops evaporate and go to the next stage.
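A minimal sketch of this selection step is shown below, assuming each drop's solution quality is already known (smaller is better here); the roulette-wheel weighting and the sampling without replacement are one plausible reading of the description rather than the authors' exact implementation.

```python
import random

def select_evaporating_drops(drops, quality):
    """Pick ER drops (Eq. (18)) by roulette wheel, biased towards better quality."""
    er = random.randint(1, len(drops))        # Eq. (18): ER = Random(1, N)
    pool = list(drops)
    selected = []
    for _ in range(er):
        weights = [1.0 / quality[d] for d in pool]   # better (smaller) quality -> larger slice
        pick = random.choices(pool, weights=weights, k=1)[0]
        selected.append(pick)
        pool.remove(pick)                     # a drop evaporates at most once
    return selected

quality = {"WD1": 12.5, "WD2": 3.2, "WD3": 7.8, "WD4": 5.1}   # assumed fitness values
print(select_evaporating_drops(list(quality), quality))
```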
3.5. Condensation Stage. As the temperature decreases, water drops draw closer to each other and then collide and either combine or bounce off. These operations are useful as they allow the water drops to be in direct physical contact with each other. This stage represents the collective and cooperative behavior of all the water drops. A water drop can generate a solution without direct communication (by itself); however, communication between water drops may lead to the generation of better solutions in the next cycle. In HCA, physical contact is used for direct communication between water drops, whereas the amount of soil on the edges can be considered indirect communication between the water drops. Direct communication (information exchange) has been shown to improve solution quality significantly when applied in other algorithms [37].

The condensation stage is considered a problem-dependent stage that can be redesigned to fit the problem specification. For example, one way of implementing this stage to fit the Travelling Salesman Problem (TSP) is to use local improvement methods. Using these methods enhances the solution quality and generates better results in the next cycle (i.e., minimizes the total cost of the solution), with the hypothesis that it will reduce the number of iterations needed and help reach the optimal/near-optimal solution faster. Equation (19) shows the evaporated water (EW) selected from the overall population, where n represents the evaporation rate (the number of evaporated water drops):

EW = {WD_1, WD_2, ..., WD_n}.  (19)

Initially, all the possible combinations (as pairs) are selected from the evaporated water drops so that they can share their information with each other via collision. When two water drops collide, there is a chance either of the two water drops merging (coalescence) or of them bouncing off each other. In nature, determining which operation will take place depends on factors such as the temperature and the velocity of each water drop. In this research, we considered the similarity between the water drops' solutions to determine which process will occur. When the similarity is greater than or equal to 50% between two water drops' solutions, they merge; otherwise, they collide and bounce off each other. Mathematically, this can be represented as follows:

OP(WD_1, WD_2) = Bounce(WD_1, WD_2)   if Similarity < 50%,
OP(WD_1, WD_2) = Merge(WD_1, WD_2)    if Similarity ≥ 50%.  (20)

We use the Hamming distance [43], which counts the number of positions at which two sequences of the same length differ, to obtain a measure of the similarity between water drops. For example, two equal-length binary strings that differ in exactly two positions have a Hamming distance of two. When two water drops collide and merge, one water drop (i.e., the collector) becomes more powerful by eliminating the other one, in the process also acquiring the characteristics of the eliminated water drop (i.e., its velocity):

WD1_Vel = WD1_Vel + WD2_Vel.  (21)

On the other hand, when two water drops collide and bounce off, they share information with each other about the goodness of each node and how much each node contributes to their solutions. The bounce-off operation generates information that is used later to refine the quality of the water drops' solutions in the next cycle. The information is available to all water drops and helps them to choose nodes that make a better contribution from among all the possible nodes at the flow stage. The information consists of the weight of each node. The weight measures a node's occurrence within a solution relative to the water drop's solution quality.
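The sketch below illustrates the collision rule in (20)-(21) and the node-weight bookkeeping described above, for solutions encoded as equal-length digit sequences. The 50% threshold follows the text; the drop representation and the exact form of the weight accumulation are simplifying assumptions.

```python
def similarity(sol_a, sol_b):
    """Fraction of matching positions (1 minus the normalised Hamming distance)."""
    matches = sum(1 for a, b in zip(sol_a, sol_b) if a == b)
    return matches / len(sol_a)

def collide(wd1, wd2, node_weights):
    """Eq. (20): merge similar drops; otherwise bounce off and share node weights."""
    if similarity(wd1["solution"], wd2["solution"]) >= 0.5:
        wd1["velocity"] += wd2["velocity"]      # Eq. (21): the collector absorbs wd2
        return [wd1]                            # wd2 is eliminated
    for wd in (wd1, wd2):                       # bounce off: share node information
        for node in wd["solution"]:
            # accumulate occurrence over solution quality for each node (cf. Eq. (22))
            node_weights[node] = node_weights.get(node, 0.0) + 1.0 / wd["quality"]
    return [wd1, wd2]

weights = {}
wd1 = {"solution": [2, 7, 5, 6, 2, 4], "velocity": 40.0, "quality": 3.5}
wd2 = {"solution": [3, 1, 5, 0, 9, 8], "velocity": 25.0, "quality": 4.2}
survivors = collide(wd1, wd2, weights)
print(len(survivors), weights)
```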

The idea behind sharing these details is to emphasize the best nodes found up to that point. Within this exchange, the water drops will favor those nodes in the next cycle. Mathematically, the weight can be calculated as follows:

Weight(node) = NodeOccurrence / WD_sol.  (22)

Finally, this stage is used to update the global-best solution found up to that point. In addition, at this stage, the temperature is reduced by 50°C. The lowering and raising of the temperature helps to prevent the water drops from sticking with the same solution in every iteration.

3.6. Precipitation Stage. This stage is considered the termination stage of the cycle, as the algorithm has to check whether the termination condition is met. If the condition has been met, the algorithm stops with the last global-best solution. Otherwise, this stage is responsible for reinitializing all the dynamic variables, such as the amount of soil on each edge, the depth of the paths, the velocity of each water drop, and the amount of soil it holds. The reinitialization of the parameters helps the algorithm to avoid becoming trapped in local optima, which may affect the algorithm's performance in the next cycle. Moreover, this stage is considered a reinforcement stage, which is used to place emphasis on the best water drop (the collector). This is achieved by reducing the amount of soil on the edges that belong to the best water drop's solution:

Soil(i, j) = 0.9 × Soil(i, j),   ∀(i, j) ∈ Best^WD.  (23)

The idea behind this is to favor these edges over the other edges in the next cycle.

3.7. Summarization of HCA Characteristics. The HCA can be distinguished from other swarm algorithms in the following ways:

(1) HCA involves a number of cycles and a number of iterations (multiple iterations per cycle). This gives the advantage of performing certain tasks (e.g., solution improvement methods) every cycle instead of every iteration, to reduce the computational effort. In addition, the cyclic nature of the algorithm contributes to self-organization as the system converges to the solution through feedback from the environment and internal self-evaluation. The temperature variable controls the cycle, and its value is updated according to the quality of the solutions obtained in every iteration.

(2) The flow of the water drops is controlled by the path-depth and soil-amount heuristics (indirect communication). The path-depth factor affects the movement of water drops and helps to avoid all water drops choosing the same paths.

(3) Water drops are able to gain or lose a portion of their velocity. This idea supports competition between water drops and prevents one drop from sweeping ahead of the rest, because a high-velocity water drop can carve more paths (i.e., remove more soil) than a slower one.

(4) Soil can be deposited as well as removed, which helps to avoid premature convergence and stimulates the spread of drops in different directions (more exploration).

(5) The condensation stage is utilized to perform information sharing (direct communication) among the water drops.
Moreover, selecting a collector water drop represents an elitism technique, and this helps to perform exploitation of the good solutions. Further, this stage can be used to improve the quality of the obtained solutions so as to reduce the number of iterations.

(6) The evaporation process is a selection technique that helps the algorithm to escape from local optima. Evaporation happens when there are no improvements in the quality of the obtained solutions.

(7) The precipitation stage is used to reinitialize variables and redistribute the WDs in the search space to generate new solutions.

Condensation, evaporation, and precipitation ((5), (6), and (7) above) distinguish HCA from other water-based algorithms, including IWD and WCA. These characteristics are important and help to strike a proper balance between exploration and exploitation, which aids convergence towards the global solution. In addition, the stages of the water cycle are complementary to each other and help improve the overall performance of the HCA.

3.8. The HCA Procedure. Pseudocodes 1 and 2 show the HCA pseudocode.

HCA procedure
(i) Problem input
(ii) Initialization of parameters
(iii) While (termination condition is not met)
     Cycle = Cycle + 1
     % Flow stage
     Repeat
       (a) Iteration = Iteration + 1
       (b) For each water drop
           (1) Choose next node (Eqs. (5)-(8))
           (2) Update velocity (Eq. (9))
           (3) Update soil and depth (Eqs. (10)-(11))
           (4) Update carrying soil (Eq. (13))
       (c) End for
       (d) Calculate the solution fitness
       (e) Update local optimal solution
       (f) Update temperature (Eqs. (14)-(17))
     Until (Temperature = Evaporation temperature)
     Evaporation (WDs)
     Condensation (WDs)
     Precipitation (WDs)
(iv) End while

Pseudocode 1: HCA pseudocode.

Evaporation procedure (WDs):
  Evaluate the solution quality
  Identify the evaporation rate (Eq. (18))
  Select WDs to evaporate
End

Condensation procedure (WDs):
  Information exchange (WDs) (Eqs. (20)-(22))
  Identify the collector (WD)
  Update global solution
  Update the temperature
End

Precipitation procedure (WDs):
  Evaluate the solution quality
  Reinitialize dynamic parameters
  Global soil update (belonging to best WDs) (Eq. (23))
  Generate a new WD population
  Distribute the WDs randomly
End

Pseudocode 2: The evaporation, condensation, and precipitation pseudocode.

3.9. Similarities and Differences in Comparison to Other Water-Inspired Algorithms. In this section, we clarify the major similarities and differences between the HCA and other algorithms such as WCA and IWD. These algorithms share the same source of inspiration: water movement in nature. However, they are inspired by different aspects of the water processes and their corresponding events.

In WCA, entities are represented by a set of streams. These streams keep moving from one point to another, which simulates the flow process of the water cycle. When a better point is found, the position of the stream is swapped with that of a river or the sea, according to their fitness. The swap process simulates the effects of evaporation and rainfall rather than modelling these processes. Moreover, these effects only occur when the positions of the streams/rivers are very close to that of the sea. In contrast, the HCA has a population of artificial water drops that keep moving from one point to another, which represents the flow stage. While moving, water drops can carry or drop soil, which changes the topography of the ground by making some paths deeper and fading others. This represents the erosion and deposition processes. In WCA, no consideration is given to soil removal from the paths, which is considered a critical operation in the formation of streams and rivers. In HCA, evaporation occurs after a certain number of iterations, when the temperature reaches a specific value. The temperature is updated according to the performance of the water drops. In WCA there is no temperature, and the evaporation rate is based on a ratio derived from the quality of the solution. Another major difference is that there is no consideration of the condensation stage in WCA, which is one of the crucial stages in the water cycle. By contrast, in HCA, this stage is utilized to implement the information-sharing concept between the water drops. Finally, the parameters, operations, exploration techniques, formalization of each stage, and solution construction differ in the two algorithms.

Despite the fact that the IWD algorithm is used, with major modifications, as a subpart of HCA, there are major differences between IWD and HCA. In IWD, the probability of selecting the next node is based only on the amount of soil. In contrast, in HCA, the probability of selecting the next node is an association between the soil and the path depth, which enables the construction of a variety of solutions. This association is useful when the same amount of soil exists on different paths; the paths' depths are then used to differentiate between these edges. Moreover, this association helps in creating a balance between the exploitation and exploration of the search process. Choosing the edge with less depth favors exploration; alternatively, choosing the edge with less soil favors exploitation. Further, in HCA, additional factors, such as the amount of soil in comparison to depth, are considered in the velocity update, which allows the water drops to gain or lose velocity.
This gives other water drops a chance to compete, as the velocity of the water drops plays an important role in guiding the algorithm to discover new paths. Moreover, the soil update equation enables both soil removal and deposition. The underlying idea is to help the algorithm improve its exploration, avoid premature convergence, and avoid being trapped in local optima. In IWD, only indirect communication is considered, while both direct and indirect communication are considered in HCA. Furthermore, in HCA, the carrying soil is encoded with the solution quality; more carrying soil indicates a better solution. Finally, in HCA, three critical stages (i.e., evaporation, condensation, and precipitation) are considered to improve the performance of the algorithm. Table 1 summarizes the major differences between IWD, WCA, and HCA.

4. Experimental Setup

In this section, we explain how the HCA is applied to continuous problems.

Problem Representation. Typically, the input of the HCA is represented as a graph in which water drops can travel between nodes using the edges. In order to apply the HCA to COPs, we developed a directed graph representation that exploits a topographical approach for continuous number representation. For a function with N variables and precision P for each variable, the graph is composed of (N x P) layers. In each layer, there are ten nodes labelled from zero to nine. Therefore, there are (10 x N x P) nodes in the graph, as shown in Figure 8. This graph can be seen as a topographical mountain with different layers or contour lines. There is no connection between the nodes in the same layer; this prevents a water drop from selecting two nodes from the same layer. A connection exists only from one layer to the next lower layer, in which a node in a layer is connected to all the nodes in the next layer. The value of a node in layer i is higher than that of a node in layer i+1. This can be interpreted as the number selected from the first layer being multiplied by 0.1, the number selected from the second layer being multiplied by 0.01, and so on. This can be done easily using

$X_{value} = \sum_{L=1}^{m} n \times 10^{-L}$, (24)

where L is the layer number, m is the maximum layer number for each variable (the number of precision digits), and n is the node value in that layer. The water drops will generate values between zero and one for each variable. For example, if a water drop moves between nodes 2, 7, 5, 6, 2, and 4, respectively, then the variable value is 0.275624. The main drawback of this representation is that an increase in the number of high-precision variables leads to an increase in the search space.

Table 1: A summary of the main differences between IWD, WCA, and HCA.

Criteria | IWD | WCA | HCA
Flow stage | Moving water drops | Changing streams' positions | Moving water drops
Choosing the next node | Based on the soil | Based on river and sea positions | Based on soil and path depth
Velocity | Always increases | N/A | Increases and decreases
Soil update | Removal | N/A | Removal and deposition
Carrying soil | Equal to the amount of soil removed from a path | N/A | Encoded with the solution quality
Evaporation | N/A | When the river position is very close to the sea position | Based on the temperature value, which is affected by the percentage of improvements in the last set of iterations
Evaporation rate | N/A | If the distance between a river and the sea is less than the threshold | Random number
Condensation | N/A | N/A | Enables information sharing (direct communication); updates the global-best solution, which is used to identify the collector
Precipitation | N/A | Randomly distributes a number of streams | Reinitializes dynamic parameters such as soil, velocity, and carrying soil

Variables Domain and Precision. As stated above, a function has at least one variable or up to n variables. The variables' values should be within the function domain, which defines a set of possible values for a variable. In some functions, all the variables may have the same domain range, whereas, in others, variables may have different ranges. For simplicity, the values obtained by a water drop for each variable can be scaled to values based on the domain of each variable:

$V_{value} = (U_B - L_B) \times Value + L_B$, (25)

where $V_{value}$ is the real value of the variable, $U_B$ is the upper bound, and $L_B$ is the lower bound. For example, an obtained value of 0.275624 with a variable domain of [-1, 1] is mapped to 2 x 0.275624 - 1 = -0.448752. The values of continuous variables are expressed as real numbers. Therefore, the precision (P) of the variables' values has to be identified. This precision identifies the number of digits after the decimal point. Dealing with real numbers makes the algorithm faster than would be possible by simply considering a graph of binary numbers, as there is no need to encode/decode the variables' values to and from binary values.

Solution Generator. To find a solution for a function, a number of water drops are placed on the peak of this continuous-variable mountain (graph). These drops flow down along the mountain layers. A water drop chooses only one number (node) from each layer and moves to the next layer below. This process continues until all the water drops reach the last layer (ground level). Each water drop may generate a different solution during its flow. However, as the algorithm iterates, some water drops start to converge towards the best water drop's solution by selecting the nodes with the highest probability.

Figure 8: Graph representation of continuous variables.
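A minimal sketch of the two mappings just described (Eqs. (24) and (25)) may help make them concrete. It is illustrative only; the domain [-10, 10] used in the last line is an arbitrary assumption for the example.

```python
def decode_digits(digits):
    """Eq. (24): combine one digit per layer into a value in [0, 1]."""
    return sum(d * 10 ** -(layer + 1) for layer, d in enumerate(digits))

def scale_to_domain(value, lower, upper):
    """Eq. (25): map a value from [0, 1] onto [lower, upper]."""
    return (upper - lower) * value + lower

raw = decode_digits([2, 7, 5, 6, 2, 4])   # node choices from the worked example
print(raw)                                 # 0.275624 (up to floating-point rounding)
print(scale_to_domain(raw, -10, 10))       # the same value mapped into [-10, 10]
```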

The candidate solutions for a function can be represented as a matrix, in which each row consists of a water drop solution. Each water drop will generate a solution to all the variables of the function. Therefore, the water drop solution is composed of a set of visited nodes, based on the number of variables and the precision of each variable:

$\text{Solutions} = \begin{bmatrix} WD_1 & s_1 & s_2 & \cdots & s_{N \times P} \\ WD_2 & s_1 & s_2 & \cdots & s_{N \times P} \\ \vdots & & & & \\ WD_i & s_1 & s_2 & \cdots & s_{N \times P} \\ \vdots & & & & \\ WD_k & s_1 & s_2 & \cdots & s_{N \times P} \end{bmatrix}$ (26)

An initial solution is generated randomly for each water drop based on the function dimensionality. Then, these solutions are evaluated, and the best combination is selected. The selected solution is favored with less soil on the edges belonging to this solution. Subsequently, the water drops start flowing and generating solutions.

Local Mutation Improvement. The algorithm can be augmented with a mutation operation to change the solutions of the water drops during collision at the condensation stage. The mutation is applied only on the selected water drops. Moreover, the mutation is applied randomly on selected variables from a water drop solution. For real numbers, a mutation operation can be implemented as follows:

$V_{new} = 1 - (\beta \times V_{old})$, (27)

where $\beta$ is a uniform random number in the range [0, 1], $V_{new}$ is the new value after the mutation, and $V_{old}$ is the original value. This mutation helps to flip the value of the variable; if the value is large, then it becomes small, and vice versa.

HCA Parameters and Assumptions. In the HCA, a number of parameters are considered to control the flow of the algorithm and to help generate a solution. Some of these parameters need to be adjusted according to the problem domain. Table 2 lists the parameters and the values used in this algorithm after initial experiments.

Table 2: HCA parameters and their values.

Parameter | Value
Number of water drops | 1
Maximum number of iterations | 500/1000
Initial soil on each edge | 1
Initial velocity | 1
Initial depth | Edge length / soil on that edge
Initial carrying soil | 1
Velocity updating | alpha = 2
Soil updating | PN = 0.99
Initial temperature | 5, beta = 1
Maximum temperature | 1

The distance between the nodes in the same layer is not specified (i.e., there is no connection) because a water drop cannot choose two nodes from the same layer. The distance between the nodes in different layers is the same and equal to one unit. Therefore, the depth is adjusted to equal the inverse of the number of times a node is selected. Under this setting, a path between (x, y) will deepen with an increasing number of selections of node y after node x. This adjustment is required because the internodal distance is constant, as stipulated in the problem representation. Hence, considering the depth to equal the distance over the soil (see (8)) will not affect the outcome as supposed.

Experimental Results and Analysis. A number of benchmark functions were selected from the literature; see [4] for a full description of these functions. The mathematical formulations of the functions used are listed in the Appendix. The functions have a variety of properties and different types. In this paper, only unconstrained functions are considered. The experiments were conducted on a PC with a Core i5 CPU and 8 GB RAM using MATLAB on Microsoft Windows 7. The characteristics of these functions are summarized in Table 3. For all the functions, the precision is assumed to be six significant figures for each variable. The HCA was executed twenty times with a maximum number of iterations of 500 for all functions, except for those with more than three variables, for which the maximum was 1000. The algorithm was tested without mutation (HCA) and with mutation (MHCA) for all the functions. The results are provided in Table 4.
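Given the protocol just described (repeated independent runs with a fixed iteration budget), a small driver along the following lines can collect the worst/average/best values and mean iteration counts of the kind reported in Table 4. It is a sketch only: the optimizer inside `run_once` is a placeholder random search standing in for HCA, and the objective, budget, and tolerance are illustrative assumptions.

```python
import random
import statistics

def run_once(objective, budget, rng, tol=1e-6):
    """Hypothetical stand-in for a single optimizer run: returns (best value, iterations used)."""
    best, used = float("inf"), budget
    for it in range(1, budget + 1):
        x = [rng.uniform(-5, 5), rng.uniform(-5, 5)]   # placeholder random-search step
        best = min(best, objective(x))
        if best < tol:            # stop early once the optimum is (nearly) reached
            used = it
            break
    return best, used

def summarize(objective, runs=20, budget=500, seed=0):
    """Collect the Worst/Average/Best values and the mean iteration count (#) over runs."""
    rng = random.Random(seed)
    outcomes = [run_once(objective, budget, rng) for _ in range(runs)]
    bests = [b for b, _ in outcomes]
    iters = [i for _, i in outcomes]
    return {"Worst": max(bests), "Average": statistics.mean(bests),
            "Best": min(bests), "#": statistics.mean(iters)}

sphere = lambda x: sum(v * v for v in x)
print(summarize(sphere))
```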
The numbers in Table 4 (the column named "Number") map to the same column in Table 3. For each function, the worst, average, and best values are listed. The symbol # represents the average number of iterations needed to reach the global solution. The results show that the algorithm gave satisfactory results for all of the functions tested, and it was able to find the global minimum solution within a small number of iterations for some functions. In order to evaluate the algorithm's performance, we calculated the success rate for each function using

$\text{Success rate}\,(\%) = \left(\frac{\text{Number of successful runs}}{\text{Total number of runs}}\right) \times 100$. (28)

A solution is accepted if the difference between the obtained solution and the optimum value is less than 0.1. The algorithm achieved 100% for all of the functions, except for Powell-4v [HCA: 60%, MHCA: 90%], Sphere-6v [HCA: 60%, MHCA: 40%], Rosenbrock-4v [HCA: 70%, MHCA: 100%], and Wood-4v [HCA: 50%, MHCA: 80%]. The numbers highlighted in boldface text in the "Best" column represent the best value achieved in the cases where HCA or MHCA performed best; in all other cases, HCA and MHCA were found to give the same best performance. The results of HCA and MHCA were compared using the Wilcoxon signed-rank test. The P values obtained in the test were 0.246, 0.1389, 0.8572, and 0.2543 for the worst, average, best, and average number of iterations, respectively. According to the P values, there is no significant difference between the results.
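As an illustration of how these two measures can be computed, the following sketch evaluates Eq. (28) and runs a Wilcoxon signed-rank test with SciPy on two paired samples, as was done for HCA versus MHCA. The sample numbers and the tolerance are made up for the example and are not the paper's data.

```python
from scipy.stats import wilcoxon

def success_rate(results, optimum, tol=0.1):
    """Eq. (28): percentage of runs whose error relative to the optimum is below `tol`."""
    successes = sum(abs(r - optimum) < tol for r in results)
    return 100.0 * successes / len(results)

hca_best  = [0.0012, 0.0030, 0.0009, 0.0021, 0.0015]   # hypothetical per-run best values
mhca_best = [0.0011, 0.0026, 0.0012, 0.0016, 0.0013]
print(success_rate(hca_best, optimum=0.0))             # 100.0 for this toy data

stat, p = wilcoxon(hca_best, mhca_best)   # paired, two-sided by default
print(stat, p)                            # no significant difference if p > 0.05
```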

Table 3: Functions' characteristics.

Number | Function name | Lower bound | Upper bound | D | f(x*) | Type
(1) | Ackley | | | | | Many local minima
(2) | Cross-In-Tray | | | | | Many local minima
(3) | Drop-Wave | | | | | Many local minima
(4) | Gramacy & Lee (2012) | | | | | Many local minima
(5) | Griewank | | | | | Many local minima
(6) | Holder Table | | | | | Many local minima
(7) | Levy | | | | | Many local minima
(8) | Levy N. 13 | | | | | Many local minima
(9) | Rastrigin | | | | | Many local minima
(10) | Schaffer N. 2 | | | | | Many local minima
(11) | Schaffer N. 4 | | | | | Many local minima
(12) | Schwefel | | | | | Many local minima
(13) | Shubert | | | | | Many local minima
(14) | Bohachevsky | | | | | Bowl-shaped
(15) | Rotated Hyper-Ellipsoid | | | | | Bowl-shaped
(16) | Sphere (Hyper) | | | | | Bowl-shaped
(17) | Sum Squares | | | | | Bowl-shaped
(18) | Booth | | | | | Plate-shaped
(19) | Matyas | | | | | Plate-shaped
(20) | McCormick | [-1.5, 4] | [-3, 4] | | | Plate-shaped
(21) | Zakharov | | | | | Plate-shaped
(22) | Three-Hump Camel | | | | | Valley-shaped
(23) | Six-Hump Camel | [-3, 3] | [-2, 2] | | | Valley-shaped
(24) | Dixon-Price | | | | | Valley-shaped
(25) | Rosenbrock | | | | | Valley-shaped
(26) | Shekel's Foxholes (De Jong N. 5) | | | | | Steep ridges/drops
(27) | Easom | | | | | Steep ridges/drops
(28) | Michalewicz | | | | | Steep ridges/drops
(29) | Beale | | | | | Other
(30) | Branin | [-5, 10] | [0, 15] | | | Other
(31) | Goldstein-Price I | | | | | Other
(32) | Goldstein-Price II | | | | | Other
(33) | Styblinski-Tang | | | | | Other
(34) | Gramacy & Lee (2008) | | | | | Other
(35) | Martin & Gaddy | | 1 | 2 | | Other
(36) | Easton and Fenton | | | | | Other
(37) | Rosenbrock | | | D | | Other
(38) | Sphere | | | | | Other
(39) | Hartmann 3-D | | | | | Other
(40) | Rosenbrock | | | | | Other
(41) | Powell | | | | | Other
(42) | Wood | | | | | Other
(43) | Sphere | | | | | Other
(44) | DeJong | | | | | Maximize function

Table 4: The obtained results with no mutation (HCA) and with mutation (MHCA).

Number | HCA: Worst, Average, Best, # | MHCA: Worst, Average, Best, #
(1) 8.88E E E E E E
(2) 2.6E + 2.6E + 2.6E E + 2.6E + 2.6E
(3) 1.E + 1.E + 1.E E + 1.E + 1.E
(4) 8.69E E E E E E
(5) .E + .E + .E E + .E + .E
(6) 1.92E E E E E E
(7) 4.23E E 7 1.5E E E 8 1.5E
(8) 1.63E 5 4.7E E E 6 3.6E 6 1.6E
(9) .E + .E + .E E + .E + .E
(10) .E + .E + .E E + .E + .E
(11) 2.93E E E E E E
(12) 6.82E E 4 2.8E E E 4 2.2E
(13) 1.87E E E E E E
(14) .E + .E + .E E + .E + .E
(15) .E + .E + .E E + .E + .E
(16) .E + .E + .E E + .E + .E
(17) .E + .E + .E E + .E + .E
(18) 2.3E E 6 .E E E E
(19) .E + .E + .E E + .E + .E
(20) 1.91E E E E E E
(21) 1.12E 4 5.7E E E 4 1.3E E
(22) .E + .E + .E E + .E + .E
(23) 1.3E + 1.3E + 1.3E E + 1.3E + 1.3E
(24) 3.8E E E E E 4 4.1E
(25) 2.3E E 1 .E E E 11 .E
(26) 9.98E E E E E E
(27) 9.97E E 1 1.E E E 1 1.E
(28) 1.8E + 1.8E + 1.8E E + 1.8E + 1.8E
(29) 9.25E E E E E E
(30) 3.98E E E E E E
(31) 3.E + 3.E + 3.E E + 3.E + 3.E
(32) 1.E + 1.E + 1.E E + 1.E + 1.E
(33) 7.83E E E E E E
(34) 4.29E E E E E E
(35) .E + .E + .E E + .E + .E
(36) 1.74E E E E E E
(37) 9.22E E E E 6 2.6E E
(38) 4.43E E 8 .E E 8 4.5E 9 .E
(39) 3.86E E E E E E
(40) 1.94E 1 7.2E E E 2 3.4E E 3 86
(41) 2.42E E E E E E
(42) 1.12E E E E 1 6.1E 2 1.E
(43) 1.5E E E E 1 3.1E E
(44) 3.91E E E E E E
Mean

This indicates that the HCA algorithm is able to obtain optimal results without the mutation operation. However, it should be noted that using the mutation operation had the advantage of reducing the average number of iterations required to converge on a solution by 61%.

The success rate was lower for functions with more than four variables because of the problem representation, where the number of layers in the representation increases with the number of variables. Consequently, the HCA performance was limited to functions with few variables. The experimental results revealed a notable advantage of the proposed problem representation, namely, the small effect of the function domain on the solution quality and algorithm performance. In contrast, the performance of some algorithms is highly affected by the function domain, because their working mechanism depends on the distribution of the entities in a multidimensional search space. If an entity wanders outside the function boundaries, its position must be reset by the algorithm.

The effectiveness of the algorithm is validated by analyzing the calculated average values for each function (i.e., the average value of each function is close to the optimal value). This demonstrates the stability of the algorithm, since it manages to find the global-optimal solution in each execution. Since the maximum number of iterations is fixed to a specific value, the average execution time was recorded for the Hypersphere function with different numbers of variables. Consequently, the execution time is approximately the same for the other functions with the same number of variables. There is a slight increase in execution time with increasing number of variables. The results are reported in Table 5.

Table 5: Average execution time per number of variables.

Hypersphere function | Average execution time (seconds)
2 variables (500 iterations) |
3 variables (500 iterations) |
4 variables (1000 iterations) |
6 variables (1000 iterations) |

Based on the literature, the most commonly used criteria for evaluating algorithms are the number of iterations (function evaluations), success rate, and solution quality (accuracy). In solving continuous problems, the performance of an algorithm can be evaluated using the number of iterations rather than execution time or solution accuracy. The aim of evaluating the number of iterations is to infer the convergence speed of the entities of an algorithm towards the global-optimal solution. Another aim is to remove hardware aspects from consideration. Generally speaking, premature or slow convergence is not preferred because it may lead to trapping in local optima.

The performance of the HCA was also compared with that of the WCA on specific functions, where the WCA results were taken from [29]. The obtained results for both algorithms are displayed in Table 6. The symbol # represents the average number of iterations. The results presented in Table 6 show that HCA and WCA produce optimal results with differing accuracies. Significance values of 0.75 (worst), 0.97 (average), and 0.86 (best) were found using the Wilcoxon signed-rank test, indicating no significant difference in performance between WCA and HCA. In addition, the average number of iterations was also compared, and the P value (0.0378) indicates a significant difference, which confirms that the HCA was able to reach the global-optimal solution with fewer iterations than WCA (in 10 of 15 cases). However, the solution accuracy of the HCA over functions with more than two variables was lower than that of WCA.
HCA was also compared with other optimization algorithms in terms of the average number of iterations. The compared algorithms were continuous ACO based on Discrete Encoding (CACO-DE) [44], Ant Colony System (ANTS), GA, BA, and GEM, using results from [25]. WCA and ER-WCA were also compared, with results taken from [29]. The success rate was 100% for all the algorithms, including HCA, except for Sphere-6v [HCA: 60%] and Rosenbrock-4v [HCA: 70%]. Table 7 shows the comparison results. As can be seen in Table 7, the HCA results are very competitive. We applied the Wilcoxon test to identify the significance of the differences between the results. We also applied the T-test to obtain P values along with the W-values. The P/W-values are reported in Table 8. Based on Table 8, there are significant differences between the results of ANTS, GA, BA, and HCA, where HCA reached the optimal solutions in fewer iterations than ANTS, GA, and BA. Although the P value for BA versus HCA indicates that there is no significant difference, the W-value shows that there is a significant difference between the two algorithms. Further, the performance of HCA relative to CACO-DE, GEM, WCA, and ER-WCA is very competitive. Overall, the results also indicate that HCA is highly efficient for function optimization.

HCA, MHCA, Continuous GA (CGA), and Enhanced Continuous Tabu Search (ECTS) were also compared on some other functions in terms of the average number of iterations required to reach an optimal solution. The results for CGA and ECTS were taken from [16]. That paper reported only the average number of iterations and the success rate of their algorithms. The success rate was 100% for all the algorithms. The results are listed in Table 9, where numbers in boldface indicate the lowest value. From Table 9, it can be noted that both HCA and MHCA achieved optimal solutions in fewer iterations than the CGA. This is confirmed using the Wilcoxon test and T-test on the obtained results; see Table 10. Although there is no significant difference between the results of HCA, MHCA, and ECTS, the means of the HCA and MHCA results are less than that of ECTS, indicating competitive performance.

Table 6: Comparison between WCA and HCA in terms of results and iterations.

Function name | D | Bound | Best result | WCA: Worst, Avg, Best, # | HCA: Worst, Avg, Best, #
De Jong | 2 | ± |
Goldstein & Price I | 2 | ± | E + 3.E + 3.E
Branin | 2 | ± | E E E
Martin & Gaddy | 2 | [, 1] | 9.29E E 4 5.1E 6 57 .E + .E + .E
Rosenbrock | 2 | ± | E E E E E E
Rosenbrock | 2 | ±1 | 9.86E E E E E 1 .E
Rosenbrock | 4 | ± | E E E E 1 7.2E E
Hypersphere | 6 | ± | E 3 6.1E E E E E
Shaffer | 2 | ±1 | 9.71E E E E + .E + .E
Goldstein & Price I | 2 | ± | E + 3.E + 3.E
Goldstein & Price II | 2 | ± | ,5 1.E + 1.E + 1.E
Six-Hump Camel Back | 2 | ± | E + 1.3E + 1.3E
Easton & Fenton | 2 | [, 1] | E E E
Wood | 4 | ±5 | 3.81E E 6 1.3E 1 15, E E E
Powell quartic | 4 | ±5 | 2.87E 9 6.9E E 11 23,5 2.42E E E
Mean

Table 7: Comparison between HCA and other algorithms in terms of average number of iterations.

Function name | D | B | CACO-DE | ANTS | GA | BA | GEM | WCA | ER-WCA | HCA
(1) De Jong | 2 | ± | ,
(2) Goldstein & Price I | 2 | ± |
(3) Branin | 2 | ± |
(4) Martin & Gaddy | 2 | [, 1] |
(5a) Rosenbrock | 2 | ± | ,
(5b) Rosenbrock | 2 | ± |
(6) Rosenbrock | 4 | ± | ,529 82,
(7) Hypersphere | 6 | ± | ,5 15,
(8) Shaffer |
Mean

Table 8: Comparison between P/W-values for HCA and other algorithms (based on Table 7).

Algorithm versus HCA | P value (T-test) | W-value (at critical value of W; Wilcoxon test) | Z-value | Is the result significant at P <= 0.05?
CACO-DE versus HCA | | (at 0) | | No
ANTS versus HCA | | (at 3) | | Yes
GA versus HCA | | (at 0) | | Yes
BA versus HCA | | (at 3) | | Yes
GEM versus HCA | | (at 3) | | No
WCA versus HCA | | (at 5) | | No
ER-WCA versus HCA | | (at 5) | | No

Table 9: Comparison of CGA, ECTS, HCA, and MHCA.

Number | Function name | D | CGA | ECTS | HCA | MHCA
(1) | Shubert | | | | |
(2) | Bohachevsky | | | | |
(3) | Zakharov | | | | |
(4) | Rosenbrock | | | | |
(5) | Easom | | | | |
(6) | Branin | | | | |
(7) | Goldstein-Price | | | | |
(8) | Hartmann | | | | |
(9) | Sphere | | | | |
Mean

Table 10: Comparison between P/W-values for CGA, ECTS, HCA, and MHCA.

Algorithm comparison | P value (T-test) | W-value (at critical value of W; Wilcoxon test) | Z-value | Is the result significant at P <= 0.05?
CGA versus HCA | | (at 5) | | Yes
CGA versus MHCA | | (at 5) | | Yes
ECTS versus HCA | | (at 2) | | No
ECTS versus MHCA | | (at 2) | | No
HCA versus MHCA | | (at 5) | | No

Figure 9 illustrates the convergence of the global and local solutions for the Ackley function according to the number of iterations. The red line ("Local") represents the quality of the solution obtained at the end of each iteration. This shows that the algorithm was able to avoid stagnation and kept generating different solutions in each iteration, reflecting the proper design of the flow stage. We postulate that such solution diversity was maintained by the depth factor and the fluctuations of the water drop velocities. The HCA minimized the Ackley function after 383 iterations.

Figure 10 shows a 2D graph of the Easom function. The function has two variables, and the global minimum value is equal to -1 at (x1 = π, x2 = π). Solving this function is difficult, as the flatness of the landscape does not give any indication of the direction towards the global minimum. Figure 11 shows the convergence of the HCA when solving the Easom function. As can be seen, the HCA kept generating different solutions in each iteration while converging towards the global minimum value. Figure 12 shows the convergence of the HCA when solving the Rosenbrock function.

Figure 13 illustrates the convergence speed of HCA on the Hypersphere function with six variables. The HCA minimized the function after 919 iterations. As shown in Figure 13, the HCA required more iterations to minimize functions with more than two variables because of the increase in the solution search space.

5. Conclusion and Future Work

In this paper, a new nature-inspired algorithm called the Hydrological Cycle Algorithm (HCA) is presented. In HCA, a collection of water drops passes through different phases, such as flow (runoff), evaporation, condensation, and precipitation, to generate a solution. The HCA has been shown to solve continuous optimization problems and provide high-quality solutions. In addition, its ability to escape local optima and find the global optima has been demonstrated, which validates the convergence and the effectiveness of the exploration and exploitation processes of the algorithm.

Figure 9: The global and local convergence of the HCA versus the iteration number (Ackley function; function value against number of iterations, global best and local best).

Figure 10: Easom function graph (from [4]).

Figure 11: The global and local convergence of the HCA on the Easom function.

Figure 12: The global and local convergence of the HCA on the Rosenbrock function.

One of the reasons for this improved performance is that both direct and indirect communication take place to share information among the water drops. The proposed representation of continuous problems can easily be adapted to other problems. For instance, approaches involving gravity can regard the representation as a topology, with swarm particles starting at the highest point. Approaches involving placing weights on graph vertices can regard the representation as a form of optimized route finding. The results of the benchmarking experiments show that HCA provides results that are in some cases better than, and in others not significantly worse than, those of other optimization algorithms. But there is room for further improvement. Further work is required to optimize the algorithm's performance when dealing with problems involving two or more variables. In terms of solution accuracy, the algorithm implementation and the problem representation could be improved, including dynamic fine-tuning of the parameters. Possible further improvements to the HCA could include other techniques to dynamically control evaporation and condensation. Studying the impact of the number of water drops on the quality of the solution is also worth investigating.

In conclusion, if a natural computing approach is to be considered a novel addition to the already large field of overlapping methods and techniques, it needs to embed the two critical concepts of self-organization and emergentism at its core, with intuitively based (i.e., nature-based) information-sharing mechanisms for effective exploration and exploitation, as well as demonstrated performance which is at least on par with related algorithms in the field. In particular, it should be shown to offer something a bit different from what is already available. We contend that HCA satisfies these criteria and, while further development is required to test its full potential, can be used with confidence by researchers dealing with continuous optimization problems.

Appendix

Table 11 shows the mathematical formulations of the test functions used, which have been taken from [4, 24].

Table 11: The mathematical formulations.

Ackley: $f(x) = -a\exp\left(-b\sqrt{\tfrac{1}{d}\sum_{i=1}^{d} x_i^2}\right) - \exp\left(\tfrac{1}{d}\sum_{i=1}^{d}\cos(c x_i)\right) + a + \exp(1)$, with $a = 20$, $b = 0.2$, and $c = 2\pi$
Cross-In-Tray: $f(x) = -0.0001\left(\left|\sin(x_1)\sin(x_2)\exp\left(\left|100 - \tfrac{\sqrt{x_1^2+x_2^2}}{\pi}\right|\right)\right| + 1\right)^{0.1}$
Drop-Wave: $f(x) = -\dfrac{1+\cos\left(12\sqrt{x_1^2+x_2^2}\right)}{0.5\left(x_1^2+x_2^2\right)+2}$
Gramacy & Lee (2012): $f(x) = \dfrac{\sin(10\pi x)}{2x} + (x-1)^4$
Griewank: $f(x) = \sum_{i=1}^{d}\dfrac{x_i^2}{4000} - \prod_{i=1}^{d}\cos\left(\dfrac{x_i}{\sqrt{i}}\right) + 1$
Holder Table: $f(x) = -\left|\sin(x_1)\cos(x_2)\exp\left(\left|1 - \tfrac{\sqrt{x_1^2+x_2^2}}{\pi}\right|\right)\right|$
Levy: $f(x) = \sin^2(\pi w_1) + \sum_{i=1}^{d-1}(w_i-1)^2\left[1+10\sin^2(\pi w_i+1)\right] + (w_d-1)^2\left[1+\sin^2(2\pi w_d)\right]$, where $w_i = 1 + \tfrac{x_i-1}{4}$, $i = 1, \ldots, d$
Levy N. 13: $f(x) = \sin^2(3\pi x_1) + (x_1-1)^2\left[1+\sin^2(3\pi x_2)\right] + (x_2-1)^2\left[1+\sin^2(2\pi x_2)\right]$
Rastrigin: $f(x) = 10d + \sum_{i=1}^{d}\left[x_i^2 - 10\cos(2\pi x_i)\right]$
Schaffer N. 2: $f(x) = 0.5 + \dfrac{\sin^2\left(x_1^2-x_2^2\right) - 0.5}{\left[1+0.001\left(x_1^2+x_2^2\right)\right]^2}$
Schaffer N. 4: $f(x) = 0.5 + \dfrac{\cos^2\left(\sin\left(\left|x_1^2-x_2^2\right|\right)\right) - 0.5}{\left[1+0.001\left(x_1^2+x_2^2\right)\right]^2}$
Schwefel: $f(x) = 418.9829\,d - \sum_{i=1}^{d} x_i\sin\left(\sqrt{\left|x_i\right|}\right)$
Shubert: $f(x) = \left(\sum_{i=1}^{5} i\cos((i+1)x_1+i)\right)\left(\sum_{i=1}^{5} i\cos((i+1)x_2+i)\right)$
Bohachevsky: $f(x) = x_1^2 + 2x_2^2 - 0.3\cos(3\pi x_1) - 0.4\cos(4\pi x_2) + 0.7$
Rotated Hyper-Ellipsoid: $f(x) = \sum_{i=1}^{d}\sum_{j=1}^{i} x_j^2$
Sphere (Hyper): $f(x) = \sum_{i=1}^{d} x_i^2$
Sum Squares: $f(x) = \sum_{i=1}^{d} i\,x_i^2$
Booth: $f(x) = (x_1+2x_2-7)^2 + (2x_1+x_2-5)^2$
Matyas: $f(x) = 0.26\left(x_1^2+x_2^2\right) - 0.48x_1x_2$
McCormick: $f(x) = \sin(x_1+x_2) + (x_1-x_2)^2 - 1.5x_1 + 2.5x_2 + 1$
Zakharov: $f(x) = \sum_{i=1}^{d} x_i^2 + \left(\sum_{i=1}^{d} 0.5\,i\,x_i\right)^2 + \left(\sum_{i=1}^{d} 0.5\,i\,x_i\right)^4$
Three-Hump Camel: $f(x) = 2x_1^2 - 1.05x_1^4 + \dfrac{x_1^6}{6} + x_1x_2 + x_2^2$

Table 11: Continued.

Six-Hump Camel: $f(x) = \left(4 - 2.1x_1^2 + \dfrac{x_1^4}{3}\right)x_1^2 + x_1x_2 + \left(-4 + 4x_2^2\right)x_2^2$
Dixon-Price: $f(x) = (x_1-1)^2 + \sum_{i=2}^{d} i\left(2x_i^2 - x_{i-1}\right)^2$
Rosenbrock: $f(x) = \sum_{i=1}^{d-1}\left[100\left(x_{i+1} - x_i^2\right)^2 + (x_i-1)^2\right]$
Shekel's Foxholes (De Jong N. 5): $f(x) = \left(0.002 + \sum_{i=1}^{25}\dfrac{1}{i + (x_1 - a_{1i})^6 + (x_2 - a_{2i})^6}\right)^{-1}$, where the columns of $a$ run over the 25 points $(a_{1i}, a_{2i})$ with coordinates in $\{-32, -16, 0, 16, 32\}$
Easom: $f(x) = -\cos(x_1)\cos(x_2)\exp\left(-(x_1-\pi)^2 - (x_2-\pi)^2\right)$
Michalewicz: $f(x) = -\sum_{i=1}^{d}\sin(x_i)\sin^{2m}\left(\dfrac{i x_i^2}{\pi}\right)$, where $m = 10$
Beale: $f(x) = (1.5 - x_1 + x_1x_2)^2 + \left(2.25 - x_1 + x_1x_2^2\right)^2 + \left(2.625 - x_1 + x_1x_2^3\right)^2$
Branin: $f(x) = a\left(x_2 - bx_1^2 + cx_1 - r\right)^2 + s(1-t)\cos(x_1) + s$, with $a = 1$, $b = \dfrac{5.1}{4\pi^2}$, $c = \dfrac{5}{\pi}$, $r = 6$, $s = 10$, and $t = \dfrac{1}{8\pi}$
Goldstein-Price I: $f(x) = \left[1 + (x_1+x_2+1)^2\left(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1x_2 + 3x_2^2\right)\right]\left[30 + (2x_1-3x_2)^2\left(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1x_2 + 27x_2^2\right)\right]$
Goldstein-Price II: $f(x) = \exp\left\{\tfrac{1}{2}\left(x_1^2 + x_2^2 - 25\right)^2\right\} + \sin^4(4x_1 - 3x_2) + \tfrac{1}{2}(2x_1 + x_2 - 10)^2$
Styblinski-Tang: $f(x) = \tfrac{1}{2}\sum_{i=1}^{d}\left(x_i^4 - 16x_i^2 + 5x_i\right)$
Gramacy & Lee (2008): $f(x) = x_1\exp\left(-x_1^2 - x_2^2\right)$
Martin & Gaddy: $f(x) = (x_1-x_2)^2 + \left(\dfrac{x_1+x_2-10}{3}\right)^2$
Easton and Fenton: $f(x) = \left(12 + x_1^2 + \dfrac{1+x_2^2}{x_1^2} + \dfrac{x_1^2x_2^2 + 100}{(x_1x_2)^4}\right)\left(\dfrac{1}{10}\right)$
Hartmann 3-D: $f(x) = -\sum_{i=1}^{4}\alpha_i\exp\left(-\sum_{j=1}^{3} A_{ij}\left(x_j - P_{ij}\right)^2\right)$, where $\alpha = (1.0, 1.2, 3.0, 3.2)^T$ and $A$ and $P$ are the standard $4\times 3$ Hartmann coefficient matrices given in [4]
Powell: $f(x) = \sum_{i=1}^{d/4}\left[\left(x_{4i-3} + 10x_{4i-2}\right)^2 + 5\left(x_{4i-1} - x_{4i}\right)^2 + \left(x_{4i-2} - 2x_{4i-1}\right)^4 + 10\left(x_{4i-3} - x_{4i}\right)^4\right]$
Wood: $f(x_1, x_2, x_3, x_4) = 100\left(x_1^2 - x_2\right)^2 + (x_1-1)^2 + (x_3-1)^2 + 90\left(x_3^2 - x_4\right)^2 + 10.1\left((x_2-1)^2 + (x_4-1)^2\right) + 19.8(x_2-1)(x_4-1)$
DeJong (maximization): $\max f(x) = 3905.93 - 100\left(x_1^2 - x_2\right)^2 - (1-x_1)^2$
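As a usage note for the Appendix, two of the formulations in Table 11 are transcribed below into plain Python, which is convenient when re-running the benchmarks. They follow the standard definitions given above (Ackley with a = 20, b = 0.2, c = 2π); this is an illustrative sketch rather than the authors' test harness.

```python
import math

def ackley(x, a=20, b=0.2, c=2 * math.pi):
    """Ackley function as defined in Table 11."""
    d = len(x)
    s1 = sum(xi ** 2 for xi in x)
    s2 = sum(math.cos(c * xi) for xi in x)
    return -a * math.exp(-b * math.sqrt(s1 / d)) - math.exp(s2 / d) + a + math.e

def rosenbrock(x):
    """Rosenbrock function as defined in Table 11."""
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

print(ackley([0.0, 0.0]))        # global minimum value, approximately 0
print(rosenbrock([1.0, 1.0]))    # global minimum value 0
```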

Figure 13: Convergence of HCA on the Hypersphere function with six variables (global-best solution value versus iteration number).

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] F. Rothlauf, "Optimization problems," in Design of Modern Heuristics: Principles and Application, pp. 7-44, Springer, Berlin, Germany, 2011.
[2] E. K. P. Chong and S. H. Zak, An Introduction to Optimization, vol. 76, John Wiley & Sons, 2013.
[3] M. Jamil and X.-S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, 2013.
[4] S. Surjanovic and D. Bingham, "Virtual library of simulation experiments: test functions and datasets," Simon Fraser University, 2013, ssurjano/index.html.
[5] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, 1997.
[6] M. Mathur, S. B. Karale, S. Priye, V. K. Jayaraman, and B. D. Kulkarni, "Ant colony approach to continuous function optimization," Industrial & Engineering Chemistry Research, vol. 39, 2000.
[7] K. Socha and M. Dorigo, "Ant colony optimization for continuous domains," European Journal of Operational Research, vol. 185, no. 3, 2008.
[8] G. Bilchev and I. C. Parmee, "The ant colony metaphor for searching continuous design spaces," Lecture Notes in Computer Science, vol. 993, pp. 25-39, 1995.
[9] N. Monmarché, G. Venturini, and M. Slimane, "On how Pachycondyla apicalis ants suggest a new search algorithm," Future Generation Computer Systems, vol. 16, no. 8, 2000.
[10] J. Dréo and P. Siarry, "Continuous interacting ant colony algorithm based on dense heterarchy," Future Generation Computer Systems, vol. 20, no. 5, 2004.
[11] L. Liu, Y. Dai, and J. Gao, "Ant colony optimization algorithm for continuous domains based on position distribution model of ant colony foraging," The Scientific World Journal, vol. 2014, Article ID 428539, 9 pages, 2014.
[12] V. K. Ojha, A. Abraham, and V. Snášel, "ACO for continuous function optimization: a performance analysis," in Proceedings of the International Conference on Intelligent Systems Design and Applications (ISDA 2014), Japan, November 2014.
[13] J. H. Holland, Adaptation in Natural and Artificial Systems, MIT Press, Cambridge, Mass, USA.
[14] M. Mitchell, An Introduction to Genetic Algorithms, MIT Press, Cambridge, MA, USA.
[15] H. Maaranen, K. Miettinen, and A. Penttinen, "On initial populations of a genetic algorithm for continuous optimization problems," Journal of Global Optimization, vol. 37, no. 3, 2007.
[16] R. Chelouah and P. Siarry, "Continuous genetic algorithm designed for the global optimization of multimodal functions," Journal of Heuristics, vol. 6, no. 2, 2000.
[17] Y.-T. Kao and E. Zahara, "A hybrid genetic algorithm and particle swarm optimization for multimodal functions," Applied Soft Computing, vol. 8, no. 2, 2008.
[18] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the 1995 IEEE International Conference on Neural Networks, IEEE, Perth, WA, Australia, 1995.
[19] W. Chen, J. Zhang, Y. Lin et al., "Particle swarm optimization with an aging leader and challengers," IEEE Transactions on Evolutionary Computation, vol. 17, no. 2, 2013.
[20] J. F. Schutte and A. A. Groenwold, "A study of global optimization using particle swarms," Journal of Global Optimization, vol. 31, no. 1, pp. 93-108, 2005.
[21] P. S. Shelokar, P. Siarry, V. K.
Jayaraman, and B. D. Kulkarni, "Particle swarm and ant colony algorithms hybridized for improved continuous optimization," Applied Mathematics and Computation, vol. 188, no. 1, 2007.
[22] J. Vesterstrom and R. Thomsen, "A comparative study of differential evolution, particle swarm optimization, and evolutionary algorithms on numerical benchmark problems," in Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No. 04TH8753), vol. 2, IEEE, Portland, OR, USA, 2004.
[23] S. C. Esquivel and C. A. C. Coello, "On the use of particle swarm optimization with multimodal functions," in Proceedings of the Congress on Evolutionary Computation (CEC '03), IEEE, Canberra, Australia, December 2003.
[24] D. T. Pham, A. Ghanbarzadeh, E. Koc, S. Otri, S. Rahim, and M. Zaidi, "The bees algorithm - a novel tool for complex optimisation," in Proceedings of the Intelligent Production Machines and Systems - 2nd I*PROMS Virtual International Conference, Elsevier, July 2006.
[25] A. Ahrari and A. A. Atai, "Grenade Explosion Method - a novel tool for optimization of multimodal functions," Applied Soft Computing, vol. 10, no. 4, 2010.
[26] H. Shah-Hosseini, "Problem solving by intelligent water drops," in Proceedings of the 2007 IEEE Congress on Evolutionary Computation (CEC 2007), Singapore, September 2007.

[27] H. Shah-Hosseini, "An approach to continuous optimization by the intelligent water drops algorithm," in Proceedings of the 4th International Conference of Cognitive Science (ICCS 2011), Iran, May 2011.
[28] H. Eskandar, A. Sadollah, A. Bahreininejad, and M. Hamdi, "Water cycle algorithm - a novel metaheuristic optimization method for solving constrained engineering optimization problems," Computers & Structures, 2012.
[29] A. Sadollah, H. Eskandar, A. Bahreininejad, and J. H. Kim, "Water cycle algorithm with evaporation rate for solving constrained and unconstrained optimization problems," Applied Soft Computing, vol. 30, pp. 58-71, 2015.
[30] Q. Luo, C. Wen, S. Qiao, and Y. Zhou, "Dual-system water cycle algorithm for constrained engineering optimization problems," in Intelligent Computing Theories and Application: 12th International Conference, ICIC 2016, D.-S. Huang, V. Bevilacqua, and P. Premaratne, Eds., Springer International Publishing, Lanzhou, China, August 2016.
[31] A. A. Heidari, R. Ali Abbaspour, and A. Rezaee Jordehi, "An efficient chaotic water cycle algorithm for optimization tasks," Neural Computing and Applications, vol. 28, no. 1, pp. 57-85, 2017.
[32] M. M. al-Rifaie, J. M. Bishop, and T. Blackwell, "Information sharing impact of stochastic diffusion search on differential evolution algorithm," Memetic Computing, vol. 4, no. 4, 2012.
[33] T. Hogg and C. P. Williams, "Solving the really hard problems with cooperative search," in Proceedings of the National Conference on Artificial Intelligence, p. 231, 1993.
[34] C. Ding, L. Lu, Y. Liu, and W. Peng, "Swarm intelligence optimization and its applications," in Advanced Research on Electronic Commerce, Web Application, and Communication: International Conference, ECWAC 2011, G. Shen and X. Huang, Eds., Springer, Guangzhou, China, April 2011.
[35] L. Goel and V. K. Panchal, "Feature extraction through information sharing in swarm intelligence techniques," Knowledge-Based Processes in Software Development, 2013.
[36] Y. Li, Z.-H. Zhan, S. Lin, J. Zhang, and X. Luo, "Competitive and cooperative particle swarm optimization with information sharing mechanism for global optimization problems," Information Sciences, vol. 293, 2015.
[37] M. Mavrovouniotis and S. Yang, "Ant colony optimization with direct communication for the traveling salesman problem," in Proceedings of the 2010 UK Workshop on Computational Intelligence (UKCI 2010), UK, September 2010.
[38] T. Davie, Fundamentals of Hydrology, Taylor & Francis, 2008.
[39] J. Southard, An Introduction to Fluid Motions, Sediment Transport, and Current-Generated Sedimentary Structures, Massachusetts Institute of Technology, Cambridge, MA, USA, 2006.
[40] B. R. Colby, Effect of Depth of Flow on Discharge of Bed Material, US Geological Survey.
[41] M. Bishop and G. H. Locket, Introduction to Chemistry, Benjamin Cummings, San Francisco, Calif, USA, 2002.
[42] J. M. Wallace and P. V. Hobbs, Atmospheric Science: An Introductory Survey, vol. 92, Academic Press, 2006.
[43] R. Hill, A First Course in Coding Theory, Clarendon Press, 1986.
[44] H. Huang and Z. Hao, "ACO for continuous optimization based on discrete encoding," in Proceedings of the International Workshop on Ant Colony Optimization and Swarm Intelligence, 2006.


More information

A Modified Immune Network Optimization Algorithm

A Modified Immune Network Optimization Algorithm IAENG International Journal of Computer Science, 4:4, IJCS_4_4_3 A Moifie Immune Network Optimization Algorithm Lu Hong, Joarer Kamruzzaman Abstract This stuy proposes a moifie artificial immune network

More information

AnyTraffic Labeled Routing

AnyTraffic Labeled Routing AnyTraffic Labele Routing Dimitri Papaimitriou 1, Pero Peroso 2, Davie Careglio 2 1 Alcatel-Lucent Bell, Antwerp, Belgium Email: imitri.papaimitriou@alcatel-lucent.com 2 Universitat Politècnica e Catalunya,

More information

A Neural Network Model Based on Graph Matching and Annealing :Application to Hand-Written Digits Recognition

A Neural Network Model Based on Graph Matching and Annealing :Application to Hand-Written Digits Recognition ITERATIOAL JOURAL OF MATHEMATICS AD COMPUTERS I SIMULATIO A eural etwork Moel Base on Graph Matching an Annealing :Application to Han-Written Digits Recognition Kyunghee Lee Abstract We present a neural

More information

Shift-map Image Registration

Shift-map Image Registration Shift-map Image Registration Svärm, Linus; Stranmark, Petter Unpublishe: 2010-01-01 Link to publication Citation for publishe version (APA): Svärm, L., & Stranmark, P. (2010). Shift-map Image Registration.

More information

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 131 CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 6.1 INTRODUCTION The Orthogonal arrays are helpful in guiding the heuristic algorithms to obtain a good solution when applied to NP-hard problems. This

More information

Loop Scheduling and Partitions for Hiding Memory Latencies

Loop Scheduling and Partitions for Hiding Memory Latencies Loop Scheuling an Partitions for Hiing Memory Latencies Fei Chen Ewin Hsing-Mean Sha Dept. of Computer Science an Engineering University of Notre Dame Notre Dame, IN 46556 Email: fchen,esha @cse.n.eu Tel:

More information

Figure 1: Schematic of an SEM [source: ]

Figure 1: Schematic of an SEM [source:   ] EECI Course: -9 May 1 by R. Sanfelice Hybri Control Systems Eelco van Horssen E.P.v.Horssen@tue.nl Project: Scanning Electron Microscopy Introuction In Scanning Electron Microscopy (SEM) a (bunle) beam

More information

Preamble. Singly linked lists. Collaboration policy and academic integrity. Getting help

Preamble. Singly linked lists. Collaboration policy and academic integrity. Getting help CS2110 Spring 2016 Assignment A. Linke Lists Due on the CMS by: See the CMS 1 Preamble Linke Lists This assignment begins our iscussions of structures. In this assignment, you will implement a structure

More information

Improving Spatial Reuse of IEEE Based Ad Hoc Networks

Improving Spatial Reuse of IEEE Based Ad Hoc Networks mproving Spatial Reuse of EEE 82.11 Base A Hoc Networks Fengji Ye, Su Yi an Biplab Sikar ECSE Department, Rensselaer Polytechnic nstitute Troy, NY 1218 Abstract n this paper, we evaluate an suggest methos

More information

Optimal path planning in a constant wind with a bounded turning rate

Optimal path planning in a constant wind with a bounded turning rate Optimal path planning in a constant win with a boune turning rate Timothy G. McGee, Stephen Spry an J. Karl Herick Center for Collaborative Control of Unmanne Vehicles, University of California, Berkeley,

More information

Verifying performance-based design objectives using assemblybased vulnerability

Verifying performance-based design objectives using assemblybased vulnerability Verying performance-base esign objectives using assemblybase vulnerability K.A. Porter Calornia Institute of Technology, Pasaena, Calornia, USA A.S. Kiremijian Stanfor University, Stanfor, Calornia, USA

More information

Learning convex bodies is hard

Learning convex bodies is hard Learning convex boies is har Navin Goyal Microsoft Research Inia navingo@microsoftcom Luis Raemacher Georgia Tech lraemac@ccgatecheu Abstract We show that learning a convex boy in R, given ranom samples

More information

The Reconstruction of Graphs. Dhananjay P. Mehendale Sir Parashurambhau College, Tilak Road, Pune , India. Abstract

The Reconstruction of Graphs. Dhananjay P. Mehendale Sir Parashurambhau College, Tilak Road, Pune , India. Abstract The Reconstruction of Graphs Dhananay P. Mehenale Sir Parashurambhau College, Tila Roa, Pune-4030, Inia. Abstract In this paper we iscuss reconstruction problems for graphs. We evelop some new ieas lie

More information

Multilevel Linear Dimensionality Reduction using Hypergraphs for Data Analysis

Multilevel Linear Dimensionality Reduction using Hypergraphs for Data Analysis Multilevel Linear Dimensionality Reuction using Hypergraphs for Data Analysis Haw-ren Fang Department of Computer Science an Engineering University of Minnesota; Minneapolis, MN 55455 hrfang@csumneu ABSTRACT

More information

IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, VOL. 31, NO. 4, APRIL

IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, VOL. 31, NO. 4, APRIL IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, VOL. 1, NO. 4, APRIL 01 74 Towar Efficient Distribute Algorithms for In-Network Binary Operator Tree Placement in Wireless Sensor Networks Zongqing Lu,

More information

Chapter 9 Memory Management

Chapter 9 Memory Management Contents 1. Introuction 2. Computer-System Structures 3. Operating-System Structures 4. Processes 5. Threas 6. CPU Scheuling 7. Process Synchronization 8. Dealocks 9. Memory Management 10.Virtual Memory

More information

d 3 d 4 d d d d d d d d d d d 1 d d d d d d

d 3 d 4 d d d d d d d d d d d 1 d d d d d d Proceeings of the IASTED International Conference Software Engineering an Applications (SEA') October 6-, 1, Scottsale, Arizona, USA AN OBJECT-ORIENTED APPROACH FOR MANAGING A NETWORK OF DATABASES Shu-Ching

More information

Nature Inspired Meta-heuristics: A Survey

Nature Inspired Meta-heuristics: A Survey Nature Inspired Meta-heuristics: A Survey Nidhi Saini Student, Computer Science & Engineering DAV Institute of Engineering and Technology Jalandhar, India Abstract: Nature provides a major inspiration

More information

Cellular Ants: Combining Ant-Based Clustering with Cellular Automata

Cellular Ants: Combining Ant-Based Clustering with Cellular Automata Cellular Ants: Combining Ant-Base Clustering with Cellular Automata Anrew Vane Moere & Justin James Clayen Key Centre of Design Computing & Cognition, University of Syney, Australia {anrew,justin}@arch.usy.eu.au

More information

Algorithm Design (4) Metaheuristics

Algorithm Design (4) Metaheuristics Algorithm Design (4) Metaheuristics Takashi Chikayama School of Engineering The University of Tokyo Formalization of Constraint Optimization Minimize (or maximize) the objective function f(x 0,, x n )

More information

An Adaptive Routing Algorithm for Communication Networks using Back Pressure Technique

An Adaptive Routing Algorithm for Communication Networks using Back Pressure Technique International OPEN ACCESS Journal Of Moern Engineering Research (IJMER) An Aaptive Routing Algorithm for Communication Networks using Back Pressure Technique Khasimpeera Mohamme 1, K. Kalpana 2 1 M. Tech

More information

SURVIVABLE IP OVER WDM: GUARANTEEEING MINIMUM NETWORK BANDWIDTH

SURVIVABLE IP OVER WDM: GUARANTEEEING MINIMUM NETWORK BANDWIDTH SURVIVABLE IP OVER WDM: GUARANTEEEING MINIMUM NETWORK BANDWIDTH Galen H Sasaki Dept Elec Engg, U Hawaii 2540 Dole Street Honolul HI 96822 USA Ching-Fong Su Fuitsu Laboratories of America 595 Lawrence Expressway

More information

Distributed Line Graphs: A Universal Technique for Designing DHTs Based on Arbitrary Regular Graphs

Distributed Line Graphs: A Universal Technique for Designing DHTs Based on Arbitrary Regular Graphs IEEE TRANSACTIONS ON KNOWLEDE AND DATA ENINEERIN, MANUSCRIPT ID Distribute Line raphs: A Universal Technique for Designing DHTs Base on Arbitrary Regular raphs Yiming Zhang an Ling Liu, Senior Member,

More information

Blind Data Classification using Hyper-Dimensional Convex Polytopes

Blind Data Classification using Hyper-Dimensional Convex Polytopes Blin Data Classification using Hyper-Dimensional Convex Polytopes Brent T. McBrie an Gilbert L. Peterson Department of Electrical an Computer Engineering Air Force Institute of Technology 9 Hobson Way

More information

A Classification of 3R Orthogonal Manipulators by the Topology of their Workspace

A Classification of 3R Orthogonal Manipulators by the Topology of their Workspace A Classification of R Orthogonal Manipulators by the Topology of their Workspace Maher aili, Philippe Wenger an Damien Chablat Institut e Recherche en Communications et Cybernétique e Nantes, UMR C.N.R.S.

More information

Learning Subproblem Complexities in Distributed Branch and Bound

Learning Subproblem Complexities in Distributed Branch and Bound Learning Subproblem Complexities in Distribute Branch an Boun Lars Otten Department of Computer Science University of California, Irvine lotten@ics.uci.eu Rina Dechter Department of Computer Science University

More information

Frequency Domain Parameter Estimation of a Synchronous Generator Using Bi-objective Genetic Algorithms

Frequency Domain Parameter Estimation of a Synchronous Generator Using Bi-objective Genetic Algorithms Proceeings of the 7th WSEAS International Conference on Simulation, Moelling an Optimization, Beijing, China, September 15-17, 2007 429 Frequenc Domain Parameter Estimation of a Snchronous Generator Using

More information

Journal of Cybernetics and Informatics. Slovak Society for Cybernetics and Informatics

Journal of Cybernetics and Informatics. Slovak Society for Cybernetics and Informatics Journal of Cybernetics an Informatics publishe by Slovak Society for Cybernetics an Informatics Volume 13, 1 http://www.sski.sk/casopis/inex.php (home page) ISSN: 1336-4774 Journal of Cybernetics an Informatics

More information

State Indexed Policy Search by Dynamic Programming. Abstract. 1. Introduction. 2. System parameterization. Charles DuHadway

State Indexed Policy Search by Dynamic Programming. Abstract. 1. Introduction. 2. System parameterization. Charles DuHadway State Inexe Policy Search by Dynamic Programming Charles DuHaway Yi Gu 5435537 503372 December 4, 2007 Abstract We consier the reinforcement learning problem of simultaneous trajectory-following an obstacle

More information

A Duality Based Approach for Realtime TV-L 1 Optical Flow

A Duality Based Approach for Realtime TV-L 1 Optical Flow A Duality Base Approach for Realtime TV-L 1 Optical Flow C. Zach 1, T. Pock 2, an H. Bischof 2 1 VRVis Research Center 2 Institute for Computer Graphics an Vision, TU Graz Abstract. Variational methos

More information

Optimization of cable-stayed bridges with box-girder decks

Optimization of cable-stayed bridges with box-girder decks Avances in Engineering Software 31 (2000) 417 423 www.elsevier.com/locate/avengsoft Optimization of cable-staye briges with box-girer ecks L.M.C. Simões*, J.H.J.O. Negrão Department of Civil Engineering,

More information

Shift-map Image Registration

Shift-map Image Registration Shift-map Image Registration Linus Svärm Petter Stranmark Centre for Mathematical Sciences, Lun University {linus,petter}@maths.lth.se Abstract Shift-map image processing is a new framework base on energy

More information

On the Placement of Internet Taps in Wireless Neighborhood Networks

On the Placement of Internet Taps in Wireless Neighborhood Networks 1 On the Placement of Internet Taps in Wireless Neighborhoo Networks Lili Qiu, Ranveer Chanra, Kamal Jain, Mohamma Mahian Abstract Recently there has emerge a novel application of wireless technology that

More information

Classical Mechanics Examples (Lagrange Multipliers)

Classical Mechanics Examples (Lagrange Multipliers) Classical Mechanics Examples (Lagrange Multipliers) Dipan Kumar Ghosh Physics Department, Inian Institute of Technology Bombay Powai, Mumbai 400076 September 3, 015 1 Introuction We have seen that the

More information

Ant Colony Optimization

Ant Colony Optimization Ant Colony Optimization CompSci 760 Patricia J Riddle 1 Natural Inspiration The name Ant Colony Optimization was chosen to reflect its original inspiration: the foraging behavior of some ant species. It

More information

Parts Assembly by Throwing Manipulation with a One-Joint Arm

Parts Assembly by Throwing Manipulation with a One-Joint Arm 21 IEEE/RSJ International Conference on Intelligent Robots an Systems, Taipei, Taiwan, October, 21. Parts Assembly by Throwing Manipulation with a One-Joint Arm Hieyuki Miyashita, Tasuku Yamawaki an Masahito

More information

Lab work #8. Congestion control

Lab work #8. Congestion control TEORÍA DE REDES DE TELECOMUNICACIONES Grao en Ingeniería Telemática Grao en Ingeniería en Sistemas e Telecomunicación Curso 2015-2016 Lab work #8. Congestion control (1 session) Author: Pablo Pavón Mariño

More information