USING THE EVOLUTION STRATEGIES' SELF-ADAPTATION MECHANISM AND TOURNAMENT SELECTION FOR GLOBAL OPTIMIZATION



EFRÉN MEZURA-MONTES AND CARLOS A. COELLO COELLO
Evolutionary Computation Group at CINVESTAV-IPN (EVOCINV)
Depto. de Ingeniería Eléctrica, Sección de Computación
Av. IPN No. 2508, Col. San Pedro Zacatenco, México D.F. 07300, MEXICO
emezura@computacion.cs.cinvestav.mx, ccoello@cs.cinvestav.mx

ABSTRACT

An approach based on a (µ+1)-ES and three simple tournament rules is proposed to solve global optimization problems. The proposed approach does not use a penalty function and does not require any parameters beyond the original parameters of an evolution strategy. The approach is validated against state-of-the-art techniques in evolutionary constrained optimization on a well-known benchmark. The results obtained are highly competitive with respect to the approaches against which ours was compared.

INTRODUCTION

Backed by a strong theoretical foundation (Schwefel, 1995; Bäck, 1996; Beyer, 2001), Evolution Strategies (ES) have proven efficient at solving a wide variety of optimization problems (Asselmeyer, 1997). However, like other evolutionary algorithms (EAs), ES lack an explicit mechanism to deal with constrained search spaces. Several approaches to incorporate constraints into the fitness function of an EA have been proposed (Michalewicz, 1996; Coello, 2002). The most common is the use of penalty functions, where the amount of constraint violation is used to penalize an infeasible solution so that feasible solutions are favored by the selection process.
Despite their simplicity, penalty functions have several drawbacks, the main one being that they require careful fine-tuning of the penalty factors so that the degree of penalization applied allows the search to approach the feasible region efficiently (Smith, 1997; Coello, 2002). In this paper, we argue that the self-adaptation mechanism of a conventional evolution strategy, combined with some very simple tournament rules based on feasibility, yields a highly competitive evolutionary algorithm for constrained optimization.

COMPARING DIFFERENT TYPES OF ES

The motivation of this work is based on the fact that some of the most recent and competitive approaches to incorporate constraints into an evolutionary algorithm use an ES (see, for example, (Runarsson, 2000; Hamida, 2002)). We therefore hypothesized that the self-adaptation mechanism of an ES plays an important role when solving numerical optimization problems. To verify this hypothesis, and to determine the most useful type of ES to deal with constrained spaces, we performed a numerical comparison of five ES variants: a (µ+λ)-ES and a (µ,λ)-ES, each implemented with and without correlated mutation, and a (µ+1)-ES using the 1/5-success rule. We adopted a very simple constraint-handling approach that shares similarities with some previous proposals (e.g., (Jiménez, 1999; Deb, 2000)). Note, however, that all of these previous proposals require an additional mechanism to maintain diversity in the population; in our proposal, no extra diversity-maintenance mechanism is used. The tournament rules adopted in the five types of ES are the following:

1. Between two feasible solutions, the one with the best fitness value wins.
2. If one solution is feasible and the other is infeasible, the feasible solution wins.
3. If both solutions are infeasible, the one with the lowest sum of constraint violation is preferred.

We used ten (out of 13) of the test functions described in (Runarsson, 2000) to evaluate the five types of ES. The chosen test functions have characteristics representative of what can be considered difficult global optimization problems for an evolutionary algorithm. The expressions of the test functions can be found in (Mezura-Montes, 2003).

TF    Optimal      Best         Mean         Median       Worst        St. Dev.
g01   -15          -15          -14.8486     -14.998      -12.9999     0.410082
g02   0.803619     0.793083     0.698932     0.708804     0.576079     0.062927
g03   1            1            1            1            1            0.000014
g04   -30665.539   -30665.539   -30665.442   -30665.539   -30663.496   0.393918
g06   -6961.814    -6961.814    -6961.814    -6961.814    -6961.814    0.00
g07   24.306       24.3681      24.7025      24.7307      25.5167      0.242956
g08   0.095825     0.095825     0.095825     0.095825     0.095825     0.00
g09   680.63       680.6317     680.6736     680.6593     680.9151     0.052483
g11   0.75         0.75         0.7844       0.776296     0.8795       0.037345
g12   1            1            1            1            1            0.00

Table 1. Statistical results of the (µ+1)-ES after the first set of experiments.

We allowed a total of 350,000 fitness function evaluations and performed 30 runs for each problem and each type of ES. Equality constraints were transformed into inequalities using a tolerance value of 0.0001. For the (µ+1)-ES, the initial values were: σ = 4.0, c = 0.99, µ = 5, and maximum number of generations = 350,000. For the (µ+λ)-ES and (µ,λ)-ES, we adopted panmictic discrete recombination both for the strategy parameters and for the decision variables. The learning rates were calculated as indicated in (Schwefel, 1995), and the initial standard deviations were 3.0 for all decision variables. The initial values for the remaining ES were: µ = 100, λ = 300, and maximum number of generations = 1166.
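The feasibility-based tournament rules and the sum-of-violations measure described above can be sketched in Python as follows. This is a minimal illustration, not the authors' implementation: the function names are ours, minimization is assumed, and equality constraints h(x) = 0 are relaxed to |h(x)| - ε ≤ 0 with the ε = 0.0001 tolerance mentioned in the text.

```python
EPS = 1e-4  # tolerance used to turn equality constraints into inequalities

def violation(x, ineq_constraints, eq_constraints):
    """Sum of constraint violations of x (0.0 means x is feasible).

    Inequalities are expected in the form g(x) <= 0; equalities h(x) = 0
    are relaxed to |h(x)| - EPS <= 0."""
    total = 0.0
    for g in ineq_constraints:
        total += max(0.0, g(x))
    for h in eq_constraints:
        total += max(0.0, abs(h(x)) - EPS)
    return total

def better(x, y, f, ineq_constraints, eq_constraints):
    """Winner of a binary tournament between x and y (minimization)."""
    vx = violation(x, ineq_constraints, eq_constraints)
    vy = violation(y, ineq_constraints, eq_constraints)
    if vx == 0.0 and vy == 0.0:      # rule 1: both feasible, best objective wins
        return x if f(x) <= f(y) else y
    if vx == 0.0 or vy == 0.0:       # rule 2: feasible beats infeasible
        return x if vx == 0.0 else y
    return x if vx <= vy else y      # rule 3: lower total violation wins
```

For example, with the single constraint g(x) = x - 1 ≤ 0 and objective f(x) = x², a feasible point beats an infeasible one regardless of objective value, and among two infeasible points the one closer to the feasible region wins.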

Due to space limitations, we only show the statistical results of the (µ+1)-ES in Table 1 (the complete statistical results can be found in (Mezura-Montes, 2003)), since it had the best overall performance, both in terms of the best solution found and in terms of its statistical measures. We found this somewhat surprising, because it is the simplest type of ES in our comparative study. Based on these encouraging results, we decided to compare our (µ+1)-ES against other approaches. The details of the selected approach (which we call Simple Evolution Strategy, or SES) are given in the pseudo-code shown in Figure 1. We adopted the 1/5-success rule to self-adapt the single sigma value used in the algorithm. At each generation, µ mutations of the current solution are created, but none of them is evaluated; we evaluate only the single offspring created by combining the µ mutations. The low computational cost of the approach is due to the fact that it requires only one sigma value to adapt and one fitness function evaluation per generation.

Begin
  t = 0
  Create a random initial solution x_0
  Evaluate f(x_0)
  For t = 1 to MAX_GENERATIONS Do
    Produce µ mutations of x_{t-1} using:
      x_i^j = x_i^{t-1} + σ[t] N_i(0,1), for all i = 1,...,n, j = 1,2,...,µ
    Generate one child x^c by combining the µ mutations:
      m = randint(1, µ); x_i^c = x_i^m, for all i = 1,...,n
    Evaluate f(x^c)
    Apply the comparison criteria to select the best individual x_t
      between x_{t-1} and x^c
    t = t + 1
    If (t mod n = 0) Then
      If (p_s > 1/5) Then σ[t] = σ[t-n] / c
      Else If (p_s < 1/5) Then σ[t] = σ[t-n] * c
      Else σ[t] = σ[t-n]
  End For
End

Figure 1. SES algorithm (n is the number of decision variables of the problem; p_s is the success rate of mutations, used to apply the 1/5-success rule).
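The loop of Figure 1 can be sketched in Python as follows. This is a hedged reconstruction from the pseudo-code, not the authors' code: the comparison criterion is passed in as a function (so the feasibility tournament of the paper, or plain objective comparison for an unconstrained test, can be plugged in), and the initialization bounds are an assumption of ours.

```python
import random

def ses(f, compare, n, mu=5, sigma=4.0, c=0.99, max_gens=1000, bounds=(-5.0, 5.0)):
    """Simple Evolution Strategy (SES): a (mu+1)-ES with a single
    step size sigma self-adapted by the 1/5-success rule.

    compare(a, b, f) must return the tournament winner between a and b.
    Only one fitness evaluation per generation is implied: the single
    child built from the mu (unevaluated) mutations."""
    lo, hi = bounds
    x = [random.uniform(lo, hi) for _ in range(n)]  # random initial solution
    successes = 0                                   # mutations that replaced x
    for t in range(1, max_gens + 1):
        # Produce mu mutations of the current solution (none is evaluated).
        mutants = [[xi + sigma * random.gauss(0.0, 1.0) for xi in x]
                   for _ in range(mu)]
        # Build ONE child: each variable is copied from a randomly chosen mutant.
        child = [mutants[random.randrange(mu)][i] for i in range(n)]
        winner = compare(child, x, f)               # comparison criteria
        if winner is child:
            successes += 1
        x = winner
        # 1/5-success rule, applied every n generations (n = no. of variables).
        if t % n == 0:
            ps = successes / n
            if ps > 0.2:
                sigma /= c      # many successes: widen the search (c < 1)
            elif ps < 0.2:
                sigma *= c      # few successes: narrow the search
            successes = 0
    return x
```

As a quick unconstrained sanity check, running this on the 2-D sphere function with a greedy comparison steadily drives the objective down, since the parent is kept whenever the child is worse.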

It is worth emphasizing that our algorithm is based on two mechanisms: (1) the natural self-adaptation mechanism of the ES, which helps the approach sample the search space well enough to reach the feasible region reasonably fast, and (2) the comparison criteria of our tournaments, which select the most promising solution to be used as the new starting point for the search.

COMPARING SES AGAINST OTHER APPROACHES

In the second part of our experiments, we compared the performance of our approach against three techniques that are representative of the state-of-the-art in the area: the homomorphous maps (HM) (Koziel, 1999), stochastic ranking (SR) (Runarsson, 2000), and the adaptive segregational constraint handling evolutionary algorithm (ASCHEA) (Hamida, 2002). Tables 2, 3 and 4 show the results of the comparison.

                           Best                     Mean                     Worst
TF    Optimal      SES          HM           SES          HM           SES          HM
g01   -15          -15          -14.7886     -14.8486     -14.7082     -12.9999     -14.6154
g02   0.803619     0.793083     0.79953      0.698932     0.79671      0.576079     0.79119
g03   1            1            0.9997       1            0.9989       1            0.9978
g04   -30665.539   -30665.539   -30664.5     -30665.442   -30655.3     -30663.496   -30645.9
g06   -6961.814    -6961.814    -6952.1      -6961.814    -6342.6      -6961.814    -5473.9
g07   24.306       24.3681      24.62        24.7025      24.826       25.5167      25.069
g08   0.095825     0.095825     0.095825     0.095825     0.08916      0.095825     0.0291438
g09   680.63       680.6317     680.91       680.6736     681.16       680.9151     683.18
g11   0.75         0.75         0.75         0.7844       0.75         0.8795       0.75
g12   1            1            0.99999      1            0.99913      1            0.99195

Table 2. Comparison of results between our approach (SES) and the Homomorphous Maps (HM).

                           Best                     Mean                     Worst
TF    Optimal      SES          SR           SES          SR           SES          SR
g01   -15          -15          -15          -14.8486     -15          -12.9999     -15
g02   0.803619     0.793083     0.803515     0.698932     0.781975     0.576079     0.726288
g03   1            1            1            1            1            1            1
g04   -30665.539   -30665.539   -30665.539   -30665.442   -30665.539   -30663.496   -30665.539
g06   -6961.814    -6961.814    -6961.814    -6961.814    -6875.940    -6961.814    -6350.262
g07   24.306       24.3681      24.307       24.7025      24.374       25.5167      24.642
g08   0.095825     0.095825     0.095825     0.095825     0.095825     0.095825     0.095825
g09   680.63       680.6317     680.63       680.6736     680.656      680.9151     680.763
g11   0.75         0.75         0.75         0.7844       0.75         0.8795       0.75
g12   1            1            1            1            1            1            1

Table 3. Comparison of results between our approach (SES) and Stochastic Ranking (SR).

                           Best                     Mean                     Worst
TF    Optimal      SES          ASCHEA       SES          ASCHEA       SES          ASCHEA
g01   -15          -15          -15          -14.8486     -14.84       -12.9999     NA
g02   0.803619     0.793083     0.785        0.698932     0.59         0.576079     NA
g03   1            1            1            1            0.9999       1            NA
g04   -30665.539   -30665.539   -30665.5     -30665.442   -30665.5     -30663.496   NA
g06   -6961.814    -6961.814    -6961.81     -6961.814    -6961.81     -6961.814    NA
g07   24.306       24.3681      24.3323      24.7025      24.66        25.5167      NA
g08   0.095825     0.095825     0.095825     0.095825     0.095825     0.095825     NA
g09   680.63       680.6317     680.63       680.6736     680.641      680.9151     NA
g11   0.75         0.75         0.75         0.7844       0.75         0.8795       NA
g12   1            1            NA           1            NA           1            NA

Table 4. Comparison of results between our approach (SES) and the Adaptive Segregational Constraint Handling Evolutionary Algorithm (ASCHEA). NA means not available.

Several issues derived from this comparison deserve discussion. Our SES was able to converge to the global optimum in 8 of the 10 test functions used (g01, g03, g04, g06, g08, g09, g11 and g12), and it converged very close to the optimum in the other two. With respect to the homomorphous maps (see Table 2), SES converged to a better best solution in 7 problems, obtained better average solutions in 8 problems, and found a better worst solution in 6 problems. Thus, SES showed a highly competitive performance, even improving on the results of the homomorphous maps in several test functions. With respect to stochastic ranking (see Table 3), SES converged to similar best solutions in 8 problems and obtained similar average and worst results in 2 problems; it found a better average and worst result in only one problem (g06). Finally, with respect to ASCHEA (see Table 4), SES converged to similar best results in 7 problems and found a better best result in problem g02. SES also obtained better average results in 3 problems and matched ASCHEA's average results in three more.
DISCUSSION OF RESULTS

The results described above indicate a competitive performance of our approach with respect to three techniques representative of the state-of-the-art in constrained evolutionary optimization. Besides being a very simple approach, it is worth noting that SES does not require any extra parameters beyond those of a standard evolution strategy. In contrast, the homomorphous maps require an additional parameter (called v) which has to be found empirically (Koziel, 1999). Stochastic ranking requires the definition of a parameter called P_f, whose value has an important impact on the performance of the approach (Runarsson, 2000). ASCHEA also requires the definition of several extra parameters, and its latest version uses niching, a process that has at least one additional parameter of its own (Hamida, 2002). Finally, the number of fitness function evaluations of our approach is less than or equal to that required by the compared approaches: stochastic ranking performed the same number of evaluations as SES (350,000), but the homomorphous maps performed 1,400,000 fitness function evaluations, and ASCHEA performed 1,500,000.

CONCLUSIONS AND FUTURE WORK

The self-adaptation mechanism of an evolution strategy, combined with a set of simple tournament rules, was used to deal with constrained search spaces. Based on the results obtained, we claim that the proposed approach is a viable alternative for constrained optimization. Its main advantage is that it does not require a penalty function or any extra parameters (other than the original parameters of an evolution strategy) to bias the search towards the feasible region of a problem. Additionally, our approach has a low computational cost and is easy to implement. Our future work consists of experimenting with a different random number distribution (a Gaussian distribution was used in this paper) to avoid the convergence to local optima observed in some cases.

ACKNOWLEDGMENTS

The first author acknowledges support from the Mexican Consejo Nacional de Ciencia y Tecnología (CONACyT) through a scholarship to pursue graduate studies at CINVESTAV-IPN's Electrical Engineering Department. The second author acknowledges support from CONACyT through project No. 32999-A.

REFERENCES

Asselmeyer, T., Ebeling, W., and Rosé, H., 1997, Evolutionary strategies of optimization. Phys. Rev. E, 56(1):1171-1180, July.

Bäck, T., 1996, Evolutionary Algorithms in Theory and Practice. Oxford University Press, New York.

Beyer, H.G., 2001, The Theory of Evolution Strategies. Springer, Berlin.

Coello Coello, C.A., 2002, Theoretical and Numerical Constraint Handling Techniques used with Evolutionary Algorithms: A Survey of the State of the Art. Computer Methods in Applied Mechanics and Engineering, 191(11-12):1245-1287, January.

Deb, K., 2000, An Efficient Constraint Handling Method for Genetic Algorithms. Computer Methods in Applied Mechanics and Engineering, 186(2/4):311-338.

Hamida, S.B. and Schoenauer, M., 2002, ASCHEA: New Results Using Adaptive Segregational Constraint Handling. In Proceedings of the Congress on Evolutionary Computation 2002 (CEC'2002), volume 1, pages 884-889, Piscataway, New Jersey, May, IEEE Service Center.

Jiménez, F. and Verdegay, J.L., 1999, Evolutionary techniques for constrained optimization problems. In Hans-Jürgen Zimmermann, editor, 7th European Congress on Intelligent Techniques and Soft Computing (EUFIT'99), Aachen, Germany, Verlag Mainz. ISBN 3-89653-808-X.

Koziel, S. and Michalewicz, Z., 1999, Evolutionary Algorithms, Homomorphous Mappings, and Constrained Parameter Optimization. Evolutionary Computation, 7(1):19-44.

Mezura-Montes, E. and Coello Coello, C.A., 2003, An Empirical Study about the Usefulness of the Evolution Strategies' Natural Self-Adaptive Mechanism to Handle Constraints in Global Optimization. Technical Report EVOCINV-01-2003, Evolutionary Computation Group at CINVESTAV, Sección de Computación, Departamento de Ingeniería Eléctrica, CINVESTAV-IPN, México D.F., México. Available at http://www.cs.cinvestav.mx/~constraint

Michalewicz, Z. and Schoenauer, M., 1996, Evolutionary Algorithms for Constrained Parameter Optimization Problems. Evolutionary Computation, 4(1):1-32.

Runarsson, T.P. and Yao, X., 2000, Stochastic Ranking for Constrained Evolutionary Optimization. IEEE Transactions on Evolutionary Computation, 4(3):284-294, September.

Schwefel, H.P., 1995, Evolution and Optimum Seeking. John Wiley & Sons Inc., New York.

Smith, A.E. and Coit, D.W., 1997, Constraint Handling Techniques - Penalty Functions. In Thomas Bäck, David B. Fogel, and Zbigniew Michalewicz, editors, Handbook of Evolutionary Computation, chapter C5.2. Oxford University Press and Institute of Physics Publishing.