Genetic algorithms with shrinking population size


Comput Stat (2010) 25. ORIGINAL PAPER

Genetic algorithms with shrinking population size

Joshua W. Hallam · Olcay Akman · Füsun Akman
Department of Mathematics, Illinois State University, Campus Box 4520, Normal, IL, USA
Supported by a Program of Excellence award from Illinois State University.

Received: 31 July 2008 / Accepted: 7 April 2010 / Published online: 22 April 2010
© Springer-Verlag 2010

Abstract  A genetic algorithm (GA) is an evolutionary computation technique inspired by the principle of biological evolution via natural selection. It employs the fundamental components of evolution, such as selection, mating, and mutation, which continue from generation to generation, creating better solutions as time progresses. Although it is mostly used as an optimization tool, the GA enjoys a wide spectrum of applications in diverse fields such as engineering, medicine, and ecology, among others. In this study, we propose three different population size reduction methods for a typical GA optimization, aiming to increase efficiency. Additionally, we compare the accuracy and precision of these methods using Monte Carlo simulations.

Keywords  Genetic algorithms · Population reduction · Adaptive, exponential, and linear reduction

1 Introduction

A genetic algorithm is an optimization technique inspired by biological evolution operating under natural selection. First popularized by Holland (1975) and extensively studied by Goldberg (1989), this technique has been shown to be robust and capable of dealing with highly multimodal and discontinuous search landscapes where traditional optimization techniques fail. Traditional methods such as hill-climbing

and derivative-based methods are able to find optimal points, but in multimodal landscapes they may get stuck in local optima, whereas the structure of genetic algorithms helps avoid this problem. In genetic algorithms, a group of possible solutions (a population of chromosomes, in genetic algorithm terminology) is evaluated, and each member is given a fitness value based on this evaluation. The chromosomes with large fitness values are allowed to mate with other chromosomes, mutate, and move on to the next generation. This process is repeated until either a certain number of generations is reached or there is no change in the best solution found for many generations. At the end of the algorithm, the chromosome with the highest fitness is taken as the solution. In order to take advantage of the process inspired by evolution and natural selection, chromosomes are encoded as binary strings. Let l denote the length of the string. Typically, if the function being optimized has n independent variables, then l is an integer multiple of n. The binary string is broken into n equal parts of length l/n, each representing one of the n variables, and each part is converted into a real number based on the range of possible values for its variable. The fitness associated with the chromosome is calculated by evaluating the function being optimized at the resulting n real values. More formally, if f: R^n → R denotes the function being optimized, and g: {0,1}^l → R^n gives the transformation from the binary string to the real values, then the fitness of a chromosome is calculated as fitness = f(g(chrom)), and the chromosomes with large fitness are chosen for the next generation. However, this choice is not deterministic. Instead, two chromosomes are selected at random, the one with the higher fitness is chosen, and the other is put back. This technique is called binary tournament. Chromosomes can be chosen more than once in a tournament.
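The encoding, decoding, and binary-tournament steps described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 15-bit-per-variable length follows Sect. 3.3, while the function and variable names are ours.

```python
import random

BITS = 15  # bits per independent variable, as in Sect. 3.3

def decode(chrom, lo, hi):
    """Map a binary string to one real value per BITS-length segment.

    A single [lo, hi] range is shared by all variables, matching the
    symmetric domains of the paper's test functions."""
    n = len(chrom) // BITS
    xs = []
    for i in range(n):
        k = int(chrom[i * BITS:(i + 1) * BITS], 2)   # 0 .. 2**BITS - 1
        xs.append(lo + (hi - lo) * k / (2 ** BITS - 1))
    return xs

def fitness(chrom, f, lo, hi):
    """fitness = f(g(chrom)), with g played by decode."""
    return f(decode(chrom, lo, hi))

def binary_tournament(pop, fits):
    """Draw two chromosomes at random and keep the fitter one."""
    i = random.randrange(len(pop))
    j = random.randrange(len(pop))
    return pop[i] if fits[i] >= fits[j] else pop[j]
```

Since the test functions in this paper are minimized while the tournament keeps large fitness, in practice one would pass the negated function value as f.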
The chosen chromosomes are then put in the mating pool. This process continues until the mating pool reaches the size of the population in the next generation. Then two chromosomes are chosen from the pool and mated. This mating is analogous to genetic recombination, in which segments of the code are swapped between the two chromosomes. The number of crossover points is up to the user, but in our work we used three. After the mating occurs, the two new chromosomes are mutated: with a certain small probability, each bit may be changed from 0 to 1 or 1 to 0. This process of crossover and mutation creates two new chromosomes, which are put into the next generation's population. This continues until all pairs in the mating pool are mated. The process of selection, mating, and mutation continues from generation to generation, creating better solutions as time progresses. In a typical genetic algorithm, the population size remains constant throughout the entire algorithm. The conditions under which theoretical convergence (as the number of generations tends to infinity) is assured for a constant-size genetic algorithm are given in Bhandari et al. (1996). In this study, we propose a genetic algorithm that reduces population size at every time step. This reduction scheme allows for a larger initial population size. With a larger size, the genetic pool is more expansive, and the algorithm has a better chance of selecting parts of the correct solution early. Additionally, the reduction is controlled by user-defined parameters which do not allow the population to be reduced drastically, to avoid being trapped in local optima. We believe that reducing population size will enable the algorithm to find the correct solution more efficiently.
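The crossover and mutation steps just described can be sketched as below. Two assumptions are flagged in the comments: cut sites are drawn uniformly here, whereas Sect. 3.3 restricts them to multiples of the 15-bit variable length, and the mutation probability 0.01 is an illustrative placeholder, not the paper's setting.

```python
import random

def crossover(p1, p2, points=3):
    """Multi-point crossover: swap alternating segments between two
    equal-length parents at `points` randomly chosen cut sites.
    (The paper places cut sites at multiples of the 15-bit variable
    length; uniform sites are used here for brevity.)"""
    cuts = sorted(random.sample(range(1, len(p1)), points))
    c1, c2 = [], []
    swap, prev = False, 0
    for cut in cuts + [len(p1)]:
        a, b = (p2, p1) if swap else (p1, p2)
        c1.append(a[prev:cut])
        c2.append(b[prev:cut])
        swap, prev = not swap, cut
    return "".join(c1), "".join(c2)

def mutate(chrom, p=0.01):
    """Flip each bit independently with probability p.
    p = 0.01 is an illustrative placeholder, not the paper's value."""
    return "".join(('1' if b == '0' else '0') if random.random() < p else b
                   for b in chrom)
```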

In Sect. 2 we study different methods of population reduction, while in Sect. 3 we examine the efficiency of these methods. Section 4 contains simulation results, followed by a discussion in Sect. 5.

2 Methods of population reduction

We have developed three different methods of population reduction for genetic algorithms. The first is an adaptive measure, and the other two are based on a predetermined pattern. We describe these three methods in detail below.

2.1 Adaptive population reduction

Adaptive population sizing is defined as continually changing the population size based on parameters within the algorithm, such as changes in average fitness or genetic variance. This method contrasts with predetermined sizing methods, in which the population size at each generation is unaffected by changes in the algorithm. Adaptive measures have been offered by several authors, and a review of current methods can be found in Lima and Lobo (2005). Here we present a new method based on the change in best fitness. This approach was used before by Eiben et al. (2004). Their method was to increase the population size if the best fitness increases, decrease the size if there is a short-term lack of fitness increase, and increase the population size if no change occurs for a long period of time. This approach has several potential problems. For example, if the population is increased, then new chromosomes must be created. However, if the new chromosomes are created merely by cloning existing ones, there is no increase in genetic diversity. In the Eiben et al. (2004) study the best individuals were cloned, which did not increase genetic diversity. It would be more beneficial, in theory, to generate random individuals to simulate gene flow.
Another problem is that, typically, the fitness of a genetic algorithm increases most quickly early in the run, which implies that under their scheme the population size grows early in the algorithm. If the individuals are cloned, the population will then lose genetic diversity even faster because of the dominance of the numerous clones with large fitness. If the population size is likely to increase early in the algorithm anyway, it seems that simply starting with a larger randomly generated population, and never increasing its size, would be better: genetic diversity would be higher and the same amount of computation would be used. Our approach takes the opposite view to Eiben et al. (2004). We believe that as the best fitness increases we may reduce the population size and obtain good results with less computation than a typical genetic algorithm. The population size is reduced only when the best fitness increases, and the method never increases the population size. To justify this, suppose we wish to optimize in a multimodal fitness landscape. If we start with a large population, then we can better explore this large landscape. However, as time continues, solutions will aggregate around a certain area in the landscape, and we can reduce the population size. Since the chromosome with the best fitness will be allowed to mate often, the solutions will concentrate around this solution. Thus, the change in best fitness is a good indicator of how well the

algorithm is performing. The initial landscape is typically rugged, and we need many chromosomes to explore it; but as the algorithm continues to run, we can think of the problem as shrinking in ruggedness, since we are concentrating on a smaller section of the landscape. A reduction in ruggedness allows a smaller population to optimize the problem with the same or better results than a larger population. The small population size, with the implementation of elitism, allows genetic drift to fine-tune the solution without losing the best solution in the process: suppose the population has aggregated in a small partition of the search space such that there are only slight changes in fitness. At this point it is economical to have a small population, because a chromosome with a small fitness advantage then has a better chance to be chosen to participate in a tournament. Although participation in the tournament is random, with a smaller population every chromosome has a better chance to be chosen; thus, those with a slightly better fitness can participate and be chosen for the mating pool. At the same time, this part of the algorithm is merely choosing between solutions which differ only a little, and it is less important than the phase of the algorithm making large jumps in fitness. We have developed a formula to quantify the amount of reduction, based on the idea that the population size should be reduced proportionally to the change in best fitness. Let N_t be the population size at generation t, and denote the relative change in best fitness at generation t by

    Δf_t^best = |(f_{t−1}^best − f_{t−2}^best) / f_{t−2}^best|.

We use the absolute value to deal with fitness values which can be both positive and negative. We then determine a parameter f_max^best and set

    N_{t+1} = (1 − Δf_t^best) N_t,   if Δf_t^best ≤ f_max^best,
    N_{t+1} = (1 − f_max^best) N_t,  if Δf_t^best > f_max^best,          (1)
    N_{t+1} = MIN_POPSIZE,           if either value above is less than MIN_POPSIZE.

As seen in Eq.
1, the size of the population of the next generation depends on several factors. The first is the percent change in best fitness (Δf_t^best). If this is below the threshold value (f_max^best), then the population of the next generation is reduced by this percent change; if it is more, then the population is reduced by the threshold value. If either of these reductions brings the size below the minimum size, then the population size is set to the minimum size. When this type of decrease is used, we implement elitism, allowing the best chromosome to continue to the next generation without change, so that the change in best fitness is always nonnegative. Clearly, we must have f_max^best < 1, since if f_max^best ≥ 1 the population would immediately be reduced to the minimum size. As a side note, the typical genetic algorithm is a special case of the method we have produced, with f_max^best = 0. The determination of the minimum population size is arbitrary; however, to avoid the negative effects of extremely small populations, we set MIN_POPSIZE = 20 based on work by Reeves (1993). As can be seen from Eq. 1 and Fig. 1, the shape of the population curve is an exponential decrease, followed by a steady section, followed again by an exponential decrease, with this pattern continuing. In Fig. 1, the exact solution was not found before the 200th generation for Rastrigin's function and Rosenbrock's valley function. Rastrigin's function is a multimodal function which, in our experiments,

Fig. 1 Population size for the adaptive method for the first 200 generations of several functions. Boxed-in areas denote the generation where the actual solution was found. The exact solution was not found for Rastrigin's function and Rosenbrock's valley function within 200 generations

tended to need many generations to converge to the exact solution. Rosenbrock's valley function is unimodal, but very flat, requiring many generations to converge.

2.2 Predetermined exponential decrease

Although the adaptive method produces a population curve with segments of exponential decrease, it requires computing Δf_t^best at every generation, as well as determining f_max^best. We now present a method which requires neither and reduces the population exponentially. Many theoretical results concerning genetic algorithms rest on Holland's Schema Theorem (Holland 1975). A schema can be thought of as a part of a solution (i.e., part of a binary string); schemata come together to form fitter chromosomes. Holland was able to show that short schemata with high fitness tend to increase in the population exponentially through crossover. This exponential increase brings highly fit segments of strings together to form better and better solutions. The Schema Theorem underscores the importance of crossover, a characteristic of genetic algorithms that distinguishes them from other methods such as hill-climbing. Based on this theorem, we believe that we can reduce the population size exponentially and obtain results comparable to an algorithm which has no reduction. To perform this reduction, the following formula

is used:

    N_t = (N_0 + α) e^{ct} − α,  where  c = ln((N_END + α)/(N_0 + α)) / (number of generations).    (2)

Here α is a parameter used to shape the population curve. In all of our studies α = 5 was chosen based on empirical evidence. Also, N_END denotes the population size at the end of the algorithm. It is set to 20, in agreement with the minimum population size used in the adaptive method.

2.3 Predetermined linear decrease

It is not possible to predict the shape of the exponential increase of schemata without direct and complicated calculations during the algorithm. Therefore, we have also developed a reduction method which is not exponential, but instead decreases the population size linearly. This avoids decreasing the population too quickly, while still reducing the number of computations relative to a traditional genetic algorithm. The following formula determines the population size at each generation:

    N_t = N_0 + mt,  where  m = (N_END − N_0) / (number of generations).    (3)

Again, we set N_END = 20. Figure 2 depicts the three population reduction methods given above together with the typical (no reduction) method.

3 Testing reduction methods

To determine the effectiveness of the reduction methods, we ran simulations using five sets of problems. Three of the five sets were based on massively multimodal functions and the others on unimodal functions. The three multimodal functions have different properties that make them interesting to compare. Although many real-world problems for which genetic algorithms work well are multimodal, we chose to use two unimodal functions as well, in order to understand how our new reduction methods are affected by the shape of the fitness landscape. All five of the functions we used can be defined for an arbitrary number of independent variables. In all of the simulations, each function had 2-10 variables.
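The three population-size schedules of Sects. 2.1-2.3 can be sketched together as follows. MIN_POPSIZE = 20, f_max^best = 0.08, and α = 5 follow the text; the default generation count of 250 and all names are illustrative.

```python
import math

MIN_POPSIZE = 20  # minimum size, following Reeves (1993) as cited in the text

def adaptive_next_size(n_t, f_best_prev, f_best_prev2, f_max=0.08):
    """Adaptive reduction (Eq. 1): shrink by the relative change in best
    fitness, capped at f_max, and never drop below MIN_POPSIZE."""
    df = abs((f_best_prev - f_best_prev2) / f_best_prev2)
    n_next = round((1 - min(df, f_max)) * n_t)
    return max(n_next, MIN_POPSIZE)

def exponential_size(t, n0, n_end=20, generations=250, alpha=5):
    """Predetermined exponential decrease (Eq. 2); alpha = 5 as in the text."""
    c = math.log((n_end + alpha) / (n0 + alpha)) / generations
    return max(round((n0 + alpha) * math.exp(c * t) - alpha), n_end)

def linear_size(t, n0, n_end=20, generations=250):
    """Predetermined linear decrease (Eq. 3)."""
    m = (n_end - n0) / generations
    return max(round(n0 + m * t), n_end)
```

Setting f_max = 0 in adaptive_next_size recovers the constant-size typical GA, as noted after Eq. 1.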
Tables 1 and 2 give information about the test functions, where n is the total number of independent variables and x_j is the jth independent variable.

3.1 Fixed number of computations

Two different types of simulations were run in testing the three new population-reduction techniques. In the first type, the total number of computations was fixed.

Fig. 2 Population size for the exponential, linear, and adaptive reduction methods and the typical GA (no reduction)

Table 1 Test functions and their equations

    Name                          Equation                                                        Type
    Sphere model                  Σ_{i=1}^n x_i^2                                                 Unimodal
    Rastrigin's function          10n + Σ_{i=1}^n (x_i^2 − 10 cos(2π x_i))                        Multimodal
    Rosenbrock's valley function  Σ_{i=2}^n [100 (x_i − x_{i−1}^2)^2 + (1 − x_{i−1})^2]           Unimodal
    Schwefel's function           Σ_{i=1}^n −x_i sin(√|x_i|)                                      Multimodal
    Ackley's path function        −20 exp(−0.2 √(Σ_{i=1}^n x_i^2 / n))
                                    − exp(Σ_{i=1}^n cos(2π x_i) / n) + 20 + e                     Multimodal

(A computation refers to a single call to the function being optimized.) Using binary tournament, the number of function calls can be approximated by the number of chromosomes that continue to the next generation. This is only an approximation because, if a chromosome is chosen more than once for the tournament, its fitness does not need to be recalculated. After a fixed number of computations was performed, we determined the absolute difference between the best solution given by that run of the algorithm and the actual solution. For each simulation, 100 iterations

Table 2 Test functions and their parameters

    Name                          Domain of x_i         Optimal values x*                  Minimum
    Sphere model                  [−6, 6]               (0, 0, …, 0)                       0
    Rastrigin's function          [−6, 6]               (0, 0, …, 0)                       0
    Rosenbrock's valley function  [−2.048, 2.048]       (1, 1, …, 1)                       0
    Schwefel's function           [−500, 500]           (420.9687, …, 420.9687)            −418.9829n
    Ackley's path function        [−32.768, 32.768]     (0, 0, …, 0)                       0

Table 3 Number of computations allowed based on the number of independent variables

    Number of variables:             2  3  4  5  6  7  8  9  10
    Number of computations allowed:  values in the thousands, increasing with the number of variables

of the algorithm were run for the three different reduction methods in addition to the traditional genetic algorithm. Using the 100 solutions of each type, we performed t-tests to determine whether any of the three new methods outperformed the traditional genetic algorithm. We also recorded the number of times each algorithm found the correct solution. We used two different types of functions to be optimized, each with a number of variables ranging from 2 to 10. As the number of variables increases, so does the difficulty of finding the optimal solution; in light of this, as the number of variables increased, we allowed more computations to be performed. Table 3 gives the number of computations allowed for a given number of variables.

3.2 Fixed solution with acceptable tolerance

In the second type of simulation, the number of computations was not fixed; instead, we let the algorithm run until the solution it gave was within a certain radius of the actual solution. Once the best solution was within the tolerance level, the algorithm terminated, and the number of computations was recorded. Only Rastrigin's function and the sphere model were used for this simulation. Again, 100 iterations of each method were used, and a t-test was performed to determine whether there was a significant difference in the number of computations required by each method to reach the tolerance level. In the adaptive and typical methods, the choice of the number of generations was arbitrary.
This was not the case for the exponential and linear decrease,

Table 4 Initial population size (number of generations) for the fixed number of computations simulation

    Number of variables   Adaptive   Exponential   Linear      Typical
    2                     (500)      100 (250)     100 (250)   100 (100)
    3                     (500)      100 (250)     100 (250)   100 (125)
    4                     (500)      100 (250)     100 (250)   100 (150)
    5                     (500)      150 (250)     100 (250)   150 (175)
    6                     (500)      200 (250)     100 (250)   200 (200)
    7                     (500)      250 (250)     100 (250)   250 (250)
    8                     (750)      300 (300)     100 (300)   300 (300)
    9                     (750)      300 (350)     100 (350)   300 (350)
    10                    (750)      300 (400)     100 (400)   300 (400)

Table 5 Initial population size (number of generations) for the tolerance study using Rastrigin's function and the sphere model, respectively

    Number of variables   Adaptive   Exponential   Linear   Typical
    2, 3, 4, 5            ( )        ( )           ( )      ( )
    6, 7, 8, 9, 10        ( )        ( )           ( )      ( )

as their population curves depend on the total number of generations. When choosing the number of generations, we erred on the side of more computation. In effect, we set the number of generations higher than necessary, which caused the population curve to have a less steep decline. This may have skewed the results slightly in the direction of overcomputation, but we feel that this outweighed the risk of an algorithm not reaching the tolerance level.

3.3 Simulation parameters

In all the simulation runs, the probability of mutation was set to a small fixed value. The binary string length was 15 bits per independent variable. The number of crossover points was 3, and each crossover point was chosen so that it occurred at a multiple of 15. For the adaptive reduction method, we set f_max^best = 0.08. For the fixed tolerance level simulation, the tolerance level was 0.05. Table 4 gives the initial population size and the number of generations for the fixed number of computations simulation; Table 5 gives the same information for the fixed tolerance simulation. In both types of simulations, outliers were removed from the dataset before the t-tests were performed.
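Three of the five test functions above can be sketched in Python following the standard definitions in Table 1; these are our own illustrative implementations, not the authors' code.

```python
import math

def sphere(x):
    """Sphere model (unimodal): minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def rastrigin(x):
    """Rastrigin's function (multimodal): minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

def rosenbrock(x):
    """Rosenbrock's valley (unimodal, very flat): minimum 0 at (1, ..., 1)."""
    return sum(100 * (x[i] - x[i - 1] ** 2) ** 2 + (1 - x[i - 1]) ** 2
               for i in range(1, len(x)))
```

Each function accepts an arbitrary number of variables, matching the paper's use of 2-10 variables per function.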
4 Results 4.1 Fixed number of computations It is clear from the mean distance results in Table 6 and Figs. 3 and 4 that the adaptive reduction method performs well for a wide range of functions. It was never

Table 6 Average distance from the actual solution after outliers were removed (columns correspond to 2-10 variables)

Sphere
    Adaptive     6.71E E E E E E−7* 2.68E−7* 3.02E−7* 3.35E−7*
    Exponential  6.71E E E E E E E−6*− 1.03E−5*− 3.47E−5*−
    Linear       6.71E E E E E−6*− 1.33E−5*− 8.55E−5*− 1.90E−4*− 4.05E−4*−
    Typical      6.71E E E E E−7 3.5E E E E−5
Rastrigin's
    Adaptive     1.33E−5* 2E−5* 2.66E−5* 3.75E−5* 3.25E−1* 4.17E−1* 3.26E−1* 4.47E−1* 6.36E−1*
    Exponential  3.20E E E E E E E−1* 6.36E−1* 9.83E−1
    Linear       3.55E E−1*− 5.44E E E−1* 4.56E E
    Typical      3.84E E E E E E E
Rosenbrock's
    Adaptive     5.30E−3* 1.87E−1* 7.06E−1* 1.59* 2.49* 3.50*
    Exponential  2.32E−2*− 2.97E−1* * 4.27*
    Linear       2.53E−2*− 4.17E * *
    Typical      1.47E E
Schwefel's
    Adaptive     1.52E−3* 3.45E−2* 6.99E−2* 1.27E−1* 1.64E−1* 2.33E−1* 5.98E−1* 8.53E−1* 1.15*
    Exponential  8.63E E E E E−1* 2.49E−1* 2.61E−1* 3.33E−1* 4.98E−1*
    Linear       7.35E E E E E−1* 2.30E−1* 3.09E−1* 4.16E−1* 6.31E−1*
    Typical      8.21E E E E E E+1
Ackley's
    Adaptive     4.05E E E E E E−3* 4.46E−3* 4.05E−3* 4.05E−3*
    Exponential  4.05E E E E E E−3* 1.36E−2*− 2.00E−2*− 3.15E−2*−
    Linear       4.05E E E E E−3*− 2.43E−2*− 6.95E−2*− 9.96E−2*− 1.32E−1*−
    Typical      4.05E E E E E E E E E−2

* denotes significantly better than the typical genetic algorithm; *− denotes significantly worse than the typical genetic algorithm, both at the α = 0.05 level

outperformed by the typical genetic algorithm, and it outperformed the typical genetic algorithm in 34 of the 45 total categories. This implies that the adaptive method is robust and could be used on a number of functions with no loss of efficiency and usually with better performance. The predetermined exponential decreasing method was next best, outperforming the typical algorithm 11 times and being outperformed seven times. Finally, the linear reduction method was outperformed 12 times, and it outperformed the typical method nine times.
Table 7 gives the fraction of replications that found the correct solution for each function, with different numbers of variables, along with the average over all variables. The results for Rosenbrock's valley function were excluded from the table because the correct solution was found only once. Again, the results indicate that the adaptive reduction method is the preferred method: it had the largest average frequency of exact hits for three of the four functions considered.

Fig. 3 Results for fixed number of computations with 95% CI

4.2 Fixed solution with acceptable tolerance

We turn our attention to the simulation in which the number of computations was not fixed, but the tolerance about the solution was (see Table 8 for results). To reiterate, the algorithm ran until it produced a solution in the interval [−0.05, 0.05] around the actual solution. As with the previous simulation, the adaptive reduction method was the best, reaching the neighborhood with significantly less computation than the typical method in 17 of the 18 categories. The exponential decrease method came in second, outperforming the typical method eight times and being outperformed

Fig. 4 Results for fixed number of computations with 95% CI. The legend is the same as in Fig. 3

Table 7 The proportion of replications in which the genetic algorithm found the correct solution, for different numbers of variables (columns: 2-10 variables and the average)

    Sphere        Adapt   Exp   Linear   Typical
    Rastrigin's   Adapt   Exp   Linear   Typical
    Schwefel's    Adapt   Exp   Linear   Typical
    Ackley's      Adapt   Exp   Linear   Typical

The data for Rosenbrock's valley function was omitted because only one of the 3,600 replications found the correct solution

four times. The linear decrease method once again did the worst among the reduction methods, outperforming the typical method only six times and being outperformed four times. It is interesting to note that with Rastrigin's function and the sphere model, once the algorithm has reached the 0.05 neighborhood, it is likely to find the correct solution, as there is only one minimum in that neighborhood. This is not true for any of the other test functions used in the first simulation, which is why this simulation was run with these two functions. With Rastrigin's function, the average number of computations sometimes exceeded the number of computations allowed in the first type of simulation. In the first simulation, not all the algorithms found the correct solution, supporting the idea that once in the neighborhood, the correct solution will be found.

5 Discussion

It is clear that the adaptive method was the best method of reduction, outperforming the typical method in almost all the simulations. When we first started the study, we hypothesized that the adaptive method would work well for complex functions. This

Table 8 The average number of computations needed to get into the [−0.05, 0.05] neighborhood about the actual solution

Sphere
    Adapt     * 1272* 1510* 1736* 1941* 2184* 2401* 2671*
    Exp       * 2163* * 3829* 4487* 5181*
    Linear    * * 4907* 5664*
    Typical
Rastrigin's
    Adapt     4295* 7810* 13667* 14532* 19673* 23801* 27951* 31944* 39837*
    Exp       * 14127* * * * *−
    Linear    * 14747* * * * *−
    Typical

* denotes significantly better than the typical genetic algorithm; *− denotes significantly worse than the typical genetic algorithm, both at the α = 0.05 level

is true, but it also did well even for less complex problems. Arguably, the adaptive method should be preferred over a typical genetic algorithm. There are two drawbacks to the adaptive method: one is unique to it, and the other is a general problem with population reduction techniques. The first is that the parameter f_max^best has to be determined a priori, and usually ad hoc. Some guidelines can be given. First, if some information about the search space is known, and the user knows that large jumps in the fitness values are possible, then a lower value should be chosen; this prevents the population from reaching its minimum size too quickly. For example, if the functions f_1(x) = 100x^2 and f_2(x) = x^2 were being optimized, we would choose f_max^best smaller for f_1 than for f_2. Similarly, if the search space is relatively flat, then the parameter should be larger. This is most likely why the adaptive reduction method did not do as well on Rosenbrock's valley function: the search space is quite flat and the changes in fitness are very small, which made the population decrease slowly. In turn, this increased the number of computations quickly from generation to generation, and the algorithm had to terminate at an early generation because it hit the maximum number of computations.
The second drawback of the adaptive method, and indeed of any reduction method, concerns comparisons with a typical genetic algorithm: the initial population is usually chosen larger and then reduced, so that the total number of computations matches that of a typical genetic algorithm with a steady population size. This comparison is fair for a serial implementation of a genetic algorithm, but for a parallel implementation it may not be. Because most of the time is spent evaluating the fitness function, a parallel implementation can evaluate the chromosomes separately, reducing the long computation time caused by the bottleneck of fitness evaluation. However, with a shrinking population that starts from a larger initial size, parallel implementation may not be as beneficial. This holds as long as the number of independent processors is larger than the population size of the typical genetic algorithm. If the number of processors is less than or equal to the minimum number of chromosomes in a shrinking population, then

the parallel implementation of a typical genetic algorithm should not be significantly better than that of a population-reducing genetic algorithm. The other two methods, exponential and linear decrease, did not do as well as the adaptive reduction method, possibly because they reduce the population too slowly, which prevents a large initial size. The large initial size may have been the key factor in the adaptive method's success. Although these tests were done for a genetic algorithm identical to a typical genetic algorithm except for the changing population size, there is no reason to believe that the reduction techniques will not work well for genetic algorithms with other modifications. For example, a genetic algorithm with niching may work well with the adaptive reduction technique; in this scenario, each subpopulation could be reduced independently of the other subpopulations, or the reduction could depend on the evolution of the entire system.

References

Bhandari D, Murthy CA, Pal SK (1996) Genetic algorithm with elitist model and its convergence. Intern J Pattern Recognit Artif Intell 10
Eiben AE, Marchiori E, Valkó VA (2004) Evolutionary algorithms with on-the-fly population size adjustment. In: Yao X et al (eds) Parallel problem solving from nature, PPSN VIII. Lecture notes in computer science. Springer, Berlin
Goldberg DE (1989) Genetic algorithms in search, optimization, and machine learning. Addison-Wesley, Reading
Holland J (1975) Adaptation in natural and artificial systems. The MIT Press, Ann Arbor
Lima CF, Lobo FG (2005) A review of adaptive population sizing schemes in genetic algorithms. In: Proceedings of the 2005 workshops on genetic and evolutionary computation. ACM, New York
Reeves CR (1993) Using genetic algorithms with small populations. In: Proceedings of the 5th international conference on genetic algorithms. Morgan Kaufmann, San Francisco, pp 92-97


More information

A Robust Method for Solving Transcendental Equations

A Robust Method for Solving Transcendental Equations www.ijcsi.org 413 A Robust Method for Solving Transcendental Equations Md. Golam Moazzam, Amita Chakraborty and Md. Al-Amin Bhuiyan Department of Computer Science and Engineering, Jahangirnagar University,

More information

Introduction to Evolutionary Computation

Introduction to Evolutionary Computation Introduction to Evolutionary Computation Patrick Reed Department of Civil and Environmental Engineering The Pennsylvania State University preed@engr.psu.edu Slide 1 Outline What is Evolutionary Computation?

More information

Numerical Research on Distributed Genetic Algorithm with Redundant

Numerical Research on Distributed Genetic Algorithm with Redundant Numerical Research on Distributed Genetic Algorithm with Redundant Binary Number 1 Sayori Seto, 2 Akinori Kanasugi 1,2 Graduate School of Engineering, Tokyo Denki University, Japan 10kme41@ms.dendai.ac.jp,

More information

that simple hill-climbing schemes would perform poorly because a large number of bit positions must be optimized simultaneously in order to move from

that simple hill-climbing schemes would perform poorly because a large number of bit positions must be optimized simultaneously in order to move from B.2.7.5: Fitness Landscapes: Royal Road Functions Melanie Mitchell Santa Fe Institute 1399 Hyde Park Road Santa Fe, NM 87501 mm@santafe.edu Stephanie Forrest Dept. of Computer Science University of New

More information

Quad Search and Hybrid Genetic Algorithms

Quad Search and Hybrid Genetic Algorithms Quad Search and Hybrid Genetic Algorithms Darrell Whitley, Deon Garrett, and Jean-Paul Watson whitley,garrett,watsonj @cs.colostate.edu Department of Computer Science, Colorado State University Fort Collins,

More information

Genetic Algorithm an Approach to Solve Global Optimization Problems

Genetic Algorithm an Approach to Solve Global Optimization Problems Genetic Algorithm an Approach to Solve Global Optimization Problems PRATIBHA BAJPAI Amity Institute of Information Technology, Amity University, Lucknow, Uttar Pradesh, India, pratibha_bajpai@rediffmail.com

More information

Evolutionary Computation

Evolutionary Computation Evolutionary Computation Cover evolutionary computation theory and paradigms Emphasize use of EC to solve practical problems Compare with other techniques - see how EC fits in with other approaches Definition:

More information

The Influence of Binary Representations of Integers on the Performance of Selectorecombinative Genetic Algorithms

The Influence of Binary Representations of Integers on the Performance of Selectorecombinative Genetic Algorithms The Influence of Binary Representations of Integers on the Performance of Selectorecombinative Genetic Algorithms Franz Rothlauf Working Paper 1/2002 February 2002 Working Papers in Information Systems

More information

An introduction to evolutionary computation

An introduction to evolutionary computation An introduction to evolutionary computation Andrea Roli Alma Mater Studiorum Università di Bologna Cesena campus andrea.roli@unibo.it Inspiring principle Evolutionary Computation is inspired by natural

More information

Alpha Cut based Novel Selection for Genetic Algorithm

Alpha Cut based Novel Selection for Genetic Algorithm Alpha Cut based Novel for Genetic Algorithm Rakesh Kumar Professor Girdhar Gopal Research Scholar Rajesh Kumar Assistant Professor ABSTRACT Genetic algorithm (GA) has several genetic operators that can

More information

Solving Timetable Scheduling Problem by Using Genetic Algorithms

Solving Timetable Scheduling Problem by Using Genetic Algorithms Solving Timetable Scheduling Problem by Using Genetic Algorithms Branimir Sigl, Marin Golub, Vedran Mornar Faculty of Electrical Engineering and Computing, University of Zagreb Unska 3, 1 Zagreb, Croatia

More information

F. Greene 6920 Roosevelt NE #126 Seattle, WA

F. Greene 6920 Roosevelt NE #126 Seattle, WA Effects of Diploid/Dominance on Stationary Genetic Search Proceedings of the Fifth Annual Conference on Evolutionary Programming. San Diego: MIT Press. F. Greene 6920 Roosevelt NE #126 Seattle, WA 98115

More information

Estimation of the COCOMO Model Parameters Using Genetic Algorithms for NASA Software Projects

Estimation of the COCOMO Model Parameters Using Genetic Algorithms for NASA Software Projects Journal of Computer Science 2 (2): 118-123, 2006 ISSN 1549-3636 2006 Science Publications Estimation of the COCOMO Model Parameters Using Genetic Algorithms for NASA Software Projects Alaa F. Sheta Computers

More information

Genetic Algorithm. Based on Darwinian Paradigm. Intrinsically a robust search and optimization mechanism. Conceptual Algorithm

Genetic Algorithm. Based on Darwinian Paradigm. Intrinsically a robust search and optimization mechanism. Conceptual Algorithm 24 Genetic Algorithm Based on Darwinian Paradigm Reproduction Competition Survive Selection Intrinsically a robust search and optimization mechanism Slide -47 - Conceptual Algorithm Slide -48 - 25 Genetic

More information

Comparison of Major Domination Schemes for Diploid Binary Genetic Algorithms in Dynamic Environments

Comparison of Major Domination Schemes for Diploid Binary Genetic Algorithms in Dynamic Environments Comparison of Maor Domination Schemes for Diploid Binary Genetic Algorithms in Dynamic Environments A. Sima UYAR and A. Emre HARMANCI Istanbul Technical University Computer Engineering Department Maslak

More information

Programming Risk Assessment Models for Online Security Evaluation Systems

Programming Risk Assessment Models for Online Security Evaluation Systems Programming Risk Assessment Models for Online Security Evaluation Systems Ajith Abraham 1, Crina Grosan 12, Vaclav Snasel 13 1 Machine Intelligence Research Labs, MIR Labs, http://www.mirlabs.org 2 Babes-Bolyai

More information

A Parallel Processor for Distributed Genetic Algorithm with Redundant Binary Number

A Parallel Processor for Distributed Genetic Algorithm with Redundant Binary Number A Parallel Processor for Distributed Genetic Algorithm with Redundant Binary Number 1 Tomohiro KAMIMURA, 2 Akinori KANASUGI 1 Department of Electronics, Tokyo Denki University, 07ee055@ms.dendai.ac.jp

More information

Using Genetic Algorithms with Asexual Transposition

Using Genetic Algorithms with Asexual Transposition Using Genetic Algorithms with Asexual Transposition Anabela Simões,, Ernesto Costa Centre for Informatics and Systems of the University of Coimbra,, Polo II, 3030 Coimbra, Portugal Leiria College of Education,

More information

Genetic Algorithms commonly used selection, replacement, and variation operators Fernando Lobo University of Algarve

Genetic Algorithms commonly used selection, replacement, and variation operators Fernando Lobo University of Algarve Genetic Algorithms commonly used selection, replacement, and variation operators Fernando Lobo University of Algarve Outline Selection methods Replacement methods Variation operators Selection Methods

More information

Genetic Algorithms. Part 2: The Knapsack Problem. Spring 2009 Instructor: Dr. Masoud Yaghini

Genetic Algorithms. Part 2: The Knapsack Problem. Spring 2009 Instructor: Dr. Masoud Yaghini Genetic Algorithms Part 2: The Knapsack Problem Spring 2009 Instructor: Dr. Masoud Yaghini Outline Genetic Algorithms: Part 2 Problem Definition Representations Fitness Function Handling of Constraints

More information

Genetic algorithms for changing environments

Genetic algorithms for changing environments Genetic algorithms for changing environments John J. Grefenstette Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory, Washington, DC 375, USA gref@aic.nrl.navy.mil Abstract

More information

A Non-Linear Schema Theorem for Genetic Algorithms

A Non-Linear Schema Theorem for Genetic Algorithms A Non-Linear Schema Theorem for Genetic Algorithms William A Greene Computer Science Department University of New Orleans New Orleans, LA 70148 bill@csunoedu 504-280-6755 Abstract We generalize Holland

More information

A very brief introduction to genetic algorithms

A very brief introduction to genetic algorithms A very brief introduction to genetic algorithms Radoslav Harman Design of experiments seminar FACULTY OF MATHEMATICS, PHYSICS AND INFORMATICS COMENIUS UNIVERSITY IN BRATISLAVA 25.2.2013 Optimization problems:

More information

Genetic algorithms. Maximise f(x), xi Code every variable using binary string Eg.(00000) (11111)

Genetic algorithms. Maximise f(x), xi Code every variable using binary string Eg.(00000) (11111) Genetic algorithms Based on survival of the fittest. Start with population of points. Retain better points Based on natural selection. ( as in genetic processes) Genetic algorithms L u Maximise f(x), xi

More information

Copyright is owned by the Author of the thesis. Permission is given for a copy to be downloaded by an individual for the purpose of research and

Copyright is owned by the Author of the thesis. Permission is given for a copy to be downloaded by an individual for the purpose of research and Copyright is owned by the Author of the thesis. Permission is given for a copy to be downloaded by an individual for the purpose of research and private study only. The thesis may not be reproduced elsewhere

More information

Genetic Algorithms and Sudoku

Genetic Algorithms and Sudoku Genetic Algorithms and Sudoku Dr. John M. Weiss Department of Mathematics and Computer Science South Dakota School of Mines and Technology (SDSM&T) Rapid City, SD 57701-3995 john.weiss@sdsmt.edu MICS 2009

More information

Original Article Efficient Genetic Algorithm on Linear Programming Problem for Fittest Chromosomes

Original Article Efficient Genetic Algorithm on Linear Programming Problem for Fittest Chromosomes International Archive of Applied Sciences and Technology Volume 3 [2] June 2012: 47-57 ISSN: 0976-4828 Society of Education, India Website: www.soeagra.com/iaast/iaast.htm Original Article Efficient Genetic

More information

Evolutionary Computation: A Unified Approach

Evolutionary Computation: A Unified Approach Evolutionary Computation: A Unified Approach Kenneth De Jong Computer Science Department George Mason University kdejong@gmu.edu www.cs.gmu.edu/~eclab 1 Historical roots: Evolution Strategies (ESs): developed

More information

Learning in Abstract Memory Schemes for Dynamic Optimization

Learning in Abstract Memory Schemes for Dynamic Optimization Fourth International Conference on Natural Computation Learning in Abstract Memory Schemes for Dynamic Optimization Hendrik Richter HTWK Leipzig, Fachbereich Elektrotechnik und Informationstechnik, Institut

More information

Memory Allocation Technique for Segregated Free List Based on Genetic Algorithm

Memory Allocation Technique for Segregated Free List Based on Genetic Algorithm Journal of Al-Nahrain University Vol.15 (2), June, 2012, pp.161-168 Science Memory Allocation Technique for Segregated Free List Based on Genetic Algorithm Manal F. Younis Computer Department, College

More information

The Dynamics of a Genetic Algorithm on a Model Hard Optimization Problem

The Dynamics of a Genetic Algorithm on a Model Hard Optimization Problem The Dynamics of a Genetic Algorithm on a Model Hard Optimization Problem Alex Rogers Adam Prügel-Bennett Image, Speech, and Intelligent Systems Research Group, Department of Electronics and Computer Science,

More information

Th. Bäck Leiden Institute of Advanced Computer Science, Leiden University,The Netherlands

Th. Bäck Leiden Institute of Advanced Computer Science, Leiden University,The Netherlands EVOLUTIONARY COMPUTATION Th. Bäck Leiden Institute of Advanced Computer Science, Leiden University,The Netherlands Keywords: adaptation, evolution strategy, evolutionary programming, genetic algorithm,

More information

Genetic Algorithm Evolution of Cellular Automata Rules for Complex Binary Sequence Prediction

Genetic Algorithm Evolution of Cellular Automata Rules for Complex Binary Sequence Prediction Brill Academic Publishers P.O. Box 9000, 2300 PA Leiden, The Netherlands Lecture Series on Computer and Computational Sciences Volume 1, 2005, pp. 1-6 Genetic Algorithm Evolution of Cellular Automata Rules

More information

Modified Version of Roulette Selection for Evolution Algorithms - the Fan Selection

Modified Version of Roulette Selection for Evolution Algorithms - the Fan Selection Modified Version of Roulette Selection for Evolution Algorithms - the Fan Selection Adam S lowik, Micha l Bia lko Department of Electronic, Technical University of Koszalin, ul. Śniadeckich 2, 75-453 Koszalin,

More information

Using Genetic Programming to Learn Probability Distributions as Mutation Operators with Evolutionary Programming

Using Genetic Programming to Learn Probability Distributions as Mutation Operators with Evolutionary Programming Using Genetic Programming to Learn Probability Distributions as Mutation Operators with Evolutionary Programming James Bond, and Harry Potter The University of XXX Abstract. The mutation operator is the

More information

A Hybrid Tabu Search Method for Assembly Line Balancing

A Hybrid Tabu Search Method for Assembly Line Balancing Proceedings of the 7th WSEAS International Conference on Simulation, Modelling and Optimization, Beijing, China, September 15-17, 2007 443 A Hybrid Tabu Search Method for Assembly Line Balancing SUPAPORN

More information

Evolutionary SAT Solver (ESS)

Evolutionary SAT Solver (ESS) Ninth LACCEI Latin American and Caribbean Conference (LACCEI 2011), Engineering for a Smart Planet, Innovation, Information Technology and Computational Tools for Sustainable Development, August 3-5, 2011,

More information

Evolutionary Genetic Algorithms in a Constraint Satisfaction Problem: Puzzle Eternity II

Evolutionary Genetic Algorithms in a Constraint Satisfaction Problem: Puzzle Eternity II Evolutionary Genetic Algorithms in a Constraint Satisfaction Problem: Puzzle Eternity II Jorge Muñoz, German Gutierrez, and Araceli Sanchis University Carlos III of Madrid Avda. de la Universidad 30, 28911

More information

ISSN: 2319-5967 ISO 9001:2008 Certified International Journal of Engineering Science and Innovative Technology (IJESIT) Volume 2, Issue 3, May 2013

ISSN: 2319-5967 ISO 9001:2008 Certified International Journal of Engineering Science and Innovative Technology (IJESIT) Volume 2, Issue 3, May 2013 Transistor Level Fault Finding in VLSI Circuits using Genetic Algorithm Lalit A. Patel, Sarman K. Hadia CSPIT, CHARUSAT, Changa., CSPIT, CHARUSAT, Changa Abstract This paper presents, genetic based algorithm

More information

Self-Learning Genetic Algorithm for a Timetabling Problem with Fuzzy Constraints

Self-Learning Genetic Algorithm for a Timetabling Problem with Fuzzy Constraints Self-Learning Genetic Algorithm for a Timetabling Problem with Fuzzy Constraints Radomír Perzina, Jaroslav Ramík perzina(ramik)@opf.slu.cz Centre of excellence IT4Innovations Division of the University

More information

GA as a Data Optimization Tool for Predictive Analytics

GA as a Data Optimization Tool for Predictive Analytics GA as a Data Optimization Tool for Predictive Analytics Chandra.J 1, Dr.Nachamai.M 2,Dr.Anitha.S.Pillai 3 1Assistant Professor, Department of computer Science, Christ University, Bangalore,India, chandra.j@christunivesity.in

More information

Solving Three-objective Optimization Problems Using Evolutionary Dynamic Weighted Aggregation: Results and Analysis

Solving Three-objective Optimization Problems Using Evolutionary Dynamic Weighted Aggregation: Results and Analysis Solving Three-objective Optimization Problems Using Evolutionary Dynamic Weighted Aggregation: Results and Analysis Abstract. In this paper, evolutionary dynamic weighted aggregation methods are generalized

More information

New Modifications of Selection Operator in Genetic Algorithms for the Traveling Salesman Problem

New Modifications of Selection Operator in Genetic Algorithms for the Traveling Salesman Problem New Modifications of Selection Operator in Genetic Algorithms for the Traveling Salesman Problem Radovic, Marija; and Milutinovic, Veljko Abstract One of the algorithms used for solving Traveling Salesman

More information

The University of Algarve Informatics Laboratory

The University of Algarve Informatics Laboratory arxiv:cs/0602055v1 [cs.ne] 15 Feb 2006 The University of Algarve Informatics Laboratory UALG-ILAB Technical Report No. 200602 February, 2006 Revisiting Evolutionary Algorithms with On-the-Fly Population

More information

EFFICIENT GENETIC ALGORITHM ON LINEAR PROGRAMMING PROBLEM FOR FITTEST CHROMOSOMES

EFFICIENT GENETIC ALGORITHM ON LINEAR PROGRAMMING PROBLEM FOR FITTEST CHROMOSOMES Volume 3, No. 6, June 2012 Journal of Global Research in Computer Science RESEARCH PAPER Available Online at www.jgrcs.info EFFICIENT GENETIC ALGORITHM ON LINEAR PROGRAMMING PROBLEM FOR FITTEST CHROMOSOMES

More information

Asexual Versus Sexual Reproduction in Genetic Algorithms 1

Asexual Versus Sexual Reproduction in Genetic Algorithms 1 Asexual Versus Sexual Reproduction in Genetic Algorithms Wendy Ann Deslauriers (wendyd@alumni.princeton.edu) Institute of Cognitive Science,Room 22, Dunton Tower Carleton University, 25 Colonel By Drive

More information

Learning the Dominance in Diploid Genetic Algorithms for Changing Optimization Problems

Learning the Dominance in Diploid Genetic Algorithms for Changing Optimization Problems Learning the Dominance in Diploid Genetic Algorithms for Changing Optimization Problems Shengxiang Yang Abstract Using diploid representation with dominance scheme is one of the approaches developed for

More information

PLAANN as a Classification Tool for Customer Intelligence in Banking

PLAANN as a Classification Tool for Customer Intelligence in Banking PLAANN as a Classification Tool for Customer Intelligence in Banking EUNITE World Competition in domain of Intelligent Technologies The Research Report Ireneusz Czarnowski and Piotr Jedrzejowicz Department

More information

Level Sets of Arbitrary Dimension Polynomials with Positive Coefficients and Real Exponents

Level Sets of Arbitrary Dimension Polynomials with Positive Coefficients and Real Exponents Level Sets of Arbitrary Dimension Polynomials with Positive Coefficients and Real Exponents Spencer Greenberg April 20, 2006 Abstract In this paper we consider the set of positive points at which a polynomial

More information

Keywords: Travelling Salesman Problem, Map Reduce, Genetic Algorithm. I. INTRODUCTION

Keywords: Travelling Salesman Problem, Map Reduce, Genetic Algorithm. I. INTRODUCTION ISSN: 2321-7782 (Online) Impact Factor: 6.047 Volume 4, Issue 6, June 2016 International Journal of Advance Research in Computer Science and Management Studies Research Article / Survey Paper / Case Study

More information

A Binary Model on the Basis of Imperialist Competitive Algorithm in Order to Solve the Problem of Knapsack 1-0

A Binary Model on the Basis of Imperialist Competitive Algorithm in Order to Solve the Problem of Knapsack 1-0 212 International Conference on System Engineering and Modeling (ICSEM 212) IPCSIT vol. 34 (212) (212) IACSIT Press, Singapore A Binary Model on the Basis of Imperialist Competitive Algorithm in Order

More information

Lab 4: 26 th March 2012. Exercise 1: Evolutionary algorithms

Lab 4: 26 th March 2012. Exercise 1: Evolutionary algorithms Lab 4: 26 th March 2012 Exercise 1: Evolutionary algorithms 1. Found a problem where EAs would certainly perform very poorly compared to alternative approaches. Explain why. Suppose that we want to find

More information

HYBRID GENETIC ALGORITHM PARAMETER EFFECTS FOR OPTIMIZATION OF CONSTRUCTION RESOURCE ALLOCATION PROBLEM. Jin-Lee KIM 1, M. ASCE

HYBRID GENETIC ALGORITHM PARAMETER EFFECTS FOR OPTIMIZATION OF CONSTRUCTION RESOURCE ALLOCATION PROBLEM. Jin-Lee KIM 1, M. ASCE 1560 HYBRID GENETIC ALGORITHM PARAMETER EFFECTS FOR OPTIMIZATION OF CONSTRUCTION RESOURCE ALLOCATION PROBLEM Jin-Lee KIM 1, M. ASCE 1 Assistant Professor, Department of Civil Engineering and Construction

More information

BMOA: Binary Magnetic Optimization Algorithm

BMOA: Binary Magnetic Optimization Algorithm International Journal of Machine Learning and Computing Vol. 2 No. 3 June 22 BMOA: Binary Magnetic Optimization Algorithm SeyedAli Mirjalili and Siti Zaiton Mohd Hashim Abstract Recently the behavior of

More information

Inertia Weight Strategies in Particle Swarm Optimization

Inertia Weight Strategies in Particle Swarm Optimization Inertia Weight Strategies in Particle Swarm Optimization 1 J. C. Bansal, 2 P. K. Singh 3 Mukesh Saraswat, 4 Abhishek Verma, 5 Shimpi Singh Jadon, 6,7 Ajith Abraham 1,2,3,4,5 ABV-Indian Institute of Information

More information

Genetic Placement Benjamin Kopp Ece556 fall Introduction. Motivation. Genie Specification and Overview

Genetic Placement Benjamin Kopp Ece556 fall Introduction. Motivation. Genie Specification and Overview Genetic Placement Benjamin Kopp Ece556 fall 2004 Originally proposed by James P. Cohoon and William D Paris 1987 IEEE Introduction Genetic algorithms are a state space search similar in nature to simulated

More information

Selection Procedures for Module Discovery: Exploring Evolutionary Algorithms for Cognitive Science

Selection Procedures for Module Discovery: Exploring Evolutionary Algorithms for Cognitive Science Selection Procedures for Module Discovery: Exploring Evolutionary Algorithms for Cognitive Science Janet Wiles (j.wiles@csee.uq.edu.au) Ruth Schulz (ruth@csee.uq.edu.au) Scott Bolland (scottb@csee.uq.edu.au)

More information

A Novel Binary Particle Swarm Optimization

A Novel Binary Particle Swarm Optimization Proceedings of the 5th Mediterranean Conference on T33- A Novel Binary Particle Swarm Optimization Motaba Ahmadieh Khanesar, Member, IEEE, Mohammad Teshnehlab and Mahdi Aliyari Shoorehdeli K. N. Toosi

More information

Evolutionary Prefetching and Caching in an Independent Storage Units Model

Evolutionary Prefetching and Caching in an Independent Storage Units Model Evolutionary Prefetching and Caching in an Independent Units Model Athena Vakali Department of Informatics Aristotle University of Thessaloniki, Greece E-mail: avakali@csdauthgr Abstract Modern applications

More information

Markov chains and Markov Random Fields (MRFs)

Markov chains and Markov Random Fields (MRFs) Markov chains and Markov Random Fields (MRFs) 1 Why Markov Models We discuss Markov models now. This is the simplest statistical model in which we don t assume that all variables are independent; we assume

More information

Goldberg, D. E. (1989). Genetic algorithms in search, optimization, and machine learning. Reading, MA:

Goldberg, D. E. (1989). Genetic algorithms in search, optimization, and machine learning. Reading, MA: is another objective that the GA could optimize. The approach used here is also adaptable. On any particular project, the designer can congure the GA to focus on optimizing certain constraints (such as

More information

A HYBRID GENETIC ALGORITHM FOR THE MAXIMUM LIKELIHOOD ESTIMATION OF MODELS WITH MULTIPLE EQUILIBRIA: A FIRST REPORT

A HYBRID GENETIC ALGORITHM FOR THE MAXIMUM LIKELIHOOD ESTIMATION OF MODELS WITH MULTIPLE EQUILIBRIA: A FIRST REPORT New Mathematics and Natural Computation Vol. 1, No. 2 (2005) 295 303 c World Scientific Publishing Company A HYBRID GENETIC ALGORITHM FOR THE MAXIMUM LIKELIHOOD ESTIMATION OF MODELS WITH MULTIPLE EQUILIBRIA:

More information

(Refer Slide Time: 00:00:56 min)

(Refer Slide Time: 00:00:56 min) Numerical Methods and Computation Prof. S.R.K. Iyengar Department of Mathematics Indian Institute of Technology, Delhi Lecture No # 3 Solution of Nonlinear Algebraic Equations (Continued) (Refer Slide

More information

College of information technology Department of software

College of information technology Department of software University of Babylon Undergraduate: third class College of information technology Department of software Subj.: Application of AI lecture notes/2011-2012 ***************************************************************************

More information

Research on a Heuristic GA-Based Decision Support System for Rice in Heilongjiang Province

Research on a Heuristic GA-Based Decision Support System for Rice in Heilongjiang Province Research on a Heuristic GA-Based Decision Support System for Rice in Heilongjiang Province Ran Cao 1,1, Yushu Yang 1, Wei Guo 1, 1 Engineering college of Northeast Agricultural University, Haerbin, China

More information

A Service Revenue-oriented Task Scheduling Model of Cloud Computing

A Service Revenue-oriented Task Scheduling Model of Cloud Computing Journal of Information & Computational Science 10:10 (2013) 3153 3161 July 1, 2013 Available at http://www.joics.com A Service Revenue-oriented Task Scheduling Model of Cloud Computing Jianguang Deng a,b,,

More information

Experimental Design & Methodology Basic lessons in empiricism

Experimental Design & Methodology Basic lessons in empiricism Experimental Design & Methodology Basic lessons in empiricism Rafal Kicinger rkicinge@gmu.edu R. Paul Wiegand paul@tesseract.org ECLab George Mason University EClab - Summer Lecture Series p.1 Outline

More information

Comparison of Mamdani and TSK Fuzzy Models for Real Estate Appraisal

Comparison of Mamdani and TSK Fuzzy Models for Real Estate Appraisal Comparison of Mamdani and TSK Fuzzy Models for Real Estate Appraisal Dariusz Król, Tadeusz Lasota 2, Bogdan Trawiński, Krzysztof Trawiński 3, Wrocław University of Technology, Institute of Applied Informatics,

More information

Effect of Using Neural Networks in GA-Based School Timetabling

Effect of Using Neural Networks in GA-Based School Timetabling Effect of Using Neural Networks in GA-Based School Timetabling JANIS ZUTERS Department of Computer Science University of Latvia Raina bulv. 19, Riga, LV-1050 LATVIA janis.zuters@lu.lv Abstract: - The school

More information

Influence of the Crossover Operator in the Performance of the Hybrid Taguchi GA

Influence of the Crossover Operator in the Performance of the Hybrid Taguchi GA Influence of the Crossover Operator in the Performance of the Hybrid Taguchi GA Stjepan Picek Faculty of Electrical Engineering and Computing Unska 3, Zagreb, Croatia Email: stjepan@computer.org Marin

More information

Using Segment-based Genetic Algorithm with Local Search to Find Approximate Solution for Multi-Stage Supply Chain Network Design Problem

Using Segment-based Genetic Algorithm with Local Search to Find Approximate Solution for Multi-Stage Supply Chain Network Design Problem Çankaya University Journal of Science and Engineering Volume 10 (2013), No 2, 185-201. Using Segment-based Genetic Algorithm with Local Search to Find Approximate Solution for Multi-Stage Supply Chain

More information

CHAPTER 6 GENETIC ALGORITHM OPTIMIZED FUZZY CONTROLLED MOBILE ROBOT

CHAPTER 6 GENETIC ALGORITHM OPTIMIZED FUZZY CONTROLLED MOBILE ROBOT 77 CHAPTER 6 GENETIC ALGORITHM OPTIMIZED FUZZY CONTROLLED MOBILE ROBOT 6.1 INTRODUCTION The idea of evolutionary computing was introduced by (Ingo Rechenberg 1971) in his work Evolutionary strategies.

More information

Genetic Algorithm Performance with Different Selection Strategies in Solving TSP

Genetic Algorithm Performance with Different Selection Strategies in Solving TSP Proceedings of the World Congress on Engineering Vol II WCE, July 6-8,, London, U.K. Genetic Algorithm Performance with Different Selection Strategies in Solving TSP Noraini Mohd Razali, John Geraghty
