A Binary Model on the Basis of Imperialist Competitive Algorithm in Order to Solve the Problem of Knapsack 1-0




2012 International Conference on System Engineering and Modeling (ICSEM 2012), IPCSIT vol. 34 (2012), IACSIT Press, Singapore

A Binary Model on the Basis of Imperialist Competitive Algorithm in Order to Solve the Problem of Knapsack 1-0

Shirin Nozarian *a, Hodais Soltanpoor a, Majid Vafaei Jahan b
a Young Researchers Club, Mashhad Branch, Islamic Azad University, Mashhad, Iran
b Islamic Azad University, Mashhad Branch, Iran
* snozarian@yahoo.com

Abstract. Although the imperialist competitive algorithm has repeatedly been used to solve different problems, its discrete binary model has been neglected. This article offers a discrete method based on the imperialist competitive algorithm for solving the 1-0 knapsack problem. The results obtained with this method demonstrate the ability of the algorithm to solve such problems. To evaluate its efficiency, the method is compared with several heuristic algorithms: the genetic algorithm, particle swarm optimization, the ant colony system, tabu search and simulated annealing. The discrete binary imperialist competitive algorithm produced better solutions in all of the mentioned tests, and the results show that it performs around 30% better than the other compared methods.

Keywords: Discrete Problems, Knapsack Problem, Imperialist Competitive Algorithm, Genetic Algorithm, Memetic Algorithm, Particle Swarm Optimization, Ant Colony System, Tabu Search, Simulated Annealing

1. Introduction

The imperialist competitive algorithm (ICA) was introduced by Atashpaz-Gargari et al. in 2007 [1, 2] and was later used by Kaveh et al. for the optimum design of skeletal structures [3]. Like the other algorithms of this family, ICA starts from an initial collection of candidate solutions; these initial solutions are called "chromosomes" in the genetic algorithm (GA), "particles" in particle swarm optimization (PSO) and "countries" in ICA. ICA gradually improves these initial solutions (countries) through a specific process, explained in [1], and finally delivers a proper answer to the optimization problem (the favourite country) [1]. This article offers a discrete binary version of the algorithm, which is explained in the following sections.

GA, in turn, was introduced by Holland in the early 1980s [4]. It has successfully solved most optimization problems, especially binary ones. Although GA performs well on optimization problems, its efficiency is lower than that of local search, as explained in [5]. PSO has likewise performed well on optimization problems; the movement of particle swarms is an efficient, population-based and probabilistic optimization technique, and a binary version of it was introduced by Kennedy and Eberhart [6]. Kennedy and Spears compared GA and PSO in [7]; their results showed that PSO is more flexible than GA and converges to the solution faster.

Bak and Sneppen proposed a model in 1993 that became the basis of a new method called extremal optimization (EO); the method is based on the Bak-Sneppen model [8]. In contrast to GA, which solves problems by evolving a population of candidate solutions, EO searches for the solution by mutating a single solution [9]. The initial version of this method was presented in [10], and its mutation operator acted only on the worst components. That version could produce only approximate solutions, and finding the true optimum with it was hardly probable. In later studies, Boettcher and Percus therefore defined a mutation probability for each component, which greatly increased the efficiency of the model [11].

Moser and Hendtlass compared EO and ant colony optimization (ACO) in [9]. Their evaluation showed that EO is faster and more flexible than ACO in converging to the solution, but its responses fluctuate, and reaching a global optimum with this pattern is hardly probable.

Tabu search (TS) is another method compared in this article. It was introduced by Glover and Laguna in 1997, who demonstrated its efficiency on different problems [12]. Because of its need for a large memory, however, memory-optimization techniques had to be added to it. As explained in [13], TS is able to find a global optimum if it is executed properly, but it may also fail to find an optimal response, so an exact time complexity cannot be stated for it.

Simulated annealing (SA) is another algorithm compared in this research. Metropolis et al. introduced a new viewpoint for solving combinatorial optimization problems in 1953; this method was called SA. Fundamentally, their algorithm simulates a system whose temperature constantly decreases until it reaches a stable (frozen) condition [14]. Kirkpatrick et al. and Cerny used this idea in [15, 16] to solve combinatorial problems such as the travelling salesman problem, and SA has since been used for many problems, including the 0-1 knapsack. Its most important shortcoming is its inability to escape from local optima.

The rest of the article is organized as follows. An extensive definition of the knapsack problem is given in the second section, the ICA is explained in the third, the suggested discrete binary model is presented in the fourth, and the tests comparing the algorithms on the mentioned problem are reported in the fifth. Finally, conclusions and suggestions for future work are given in section six.

2. An Explanatory Definition of the Problem

The 1-0 knapsack problem: a knapsack and a number of packs are given. Every pack has a specific weight and value, and the knapsack has a specific capacity, so only a limited selection of packs can be put into it. The goal is to put packs into the knapsack in such a way that the total value carried is maximal [17].

Mathematical formulation of the problem:

Maximize $\sum_{i=1}^{n} v_i x_i$  (1)

subject to $\sum_{i=1}^{n} w_i x_i \le W$, $x_i \in \{0, 1\}$,  (2)

where $v_i$ and $w_i$ are the value and weight of pack $i$, $W$ is the knapsack capacity, and $x_i = 1$ means that pack $i$ is put into the knapsack.

It should be mentioned that the 1-0 knapsack problem is NP-complete [18]. An exact algorithm that examines every possible selection of packs has time complexity

$O(2^{n})$,  (3)

so solving the problem in polynomial time becomes hard (and practically impossible) as $n$ grows.
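As a concrete illustration of formulas (1)-(3), the following minimal Python sketch evaluates a 0/1 selection vector against the capacity constraint and, for a very small n, finds the exact optimum by enumerating all 2^n selections. This is not code from the paper (whose experiments were run in MATLAB); the names knapsack_value and brute_force_optimum are illustrative choices.

```python
from itertools import product

def knapsack_value(x, values, weights, capacity):
    """Total value of selection x (formula (1)), or 0 if the capacity constraint (2) is violated."""
    total_weight = sum(w for w, xi in zip(weights, x) if xi)
    total_value = sum(v for v, xi in zip(values, x) if xi)
    return total_value if total_weight <= capacity else 0

def brute_force_optimum(values, weights, capacity):
    """Exhaustive search over all 2^n selections, i.e. the O(2^n) cost of formula (3)."""
    n = len(values)
    return max(product((0, 1), repeat=n),
               key=lambda x: knapsack_value(x, values, weights, capacity))

# Tiny example: 4 packs, capacity 5.
values, weights, capacity = [3, 4, 5, 6], [2, 3, 4, 5], 5
best = brute_force_optimum(values, weights, capacity)
print(best, knapsack_value(best, values, weights, capacity))  # (1, 1, 0, 0) with value 7
```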
3. Imperialist Competitive Algorithm

Like other evolutionary optimization methods, this method begins with an initial population. Every element of the population is called a country, and the countries are divided into two groups: colonies and imperialists. Every imperialist brings a number of colonies under its dominion and controls them in proportion to its power [1]. The assimilation policy and the imperialist competition form the core of the algorithm; assimilation is accomplished by moving the colonies towards their empires according to a defined relation, which is described completely in [17]. In general, the imperialist competitive algorithm can be applied without restriction to any continuous optimization problem, and it has therefore been used easily in different fields such as electrical engineering, mechanics, industry, management, civil development, artificial intelligence and so on.
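To make the flow described above concrete, here is a small, self-contained Python sketch of a continuous ICA loop in the spirit of [1]: random countries are created, the strongest become imperialists, the remaining colonies are assigned to them, and each iteration moves every colony part of the way towards its imperialist (assimilation), with a colony taking over its empire if it becomes stronger. It is a simplified illustration and not the authors' implementation; in particular, the imperialist-competition step that transfers colonies between empires is omitted, and the function and parameter names are assumptions.

```python
import random

def sphere(x):
    """Toy continuous cost function to minimize (not from the paper)."""
    return sum(v * v for v in x)

def ica_continuous(cost, dim=5, n_countries=30, n_imperialists=3,
                   n_iterations=200, beta=2.0, seed=0):
    rng = random.Random(seed)
    # Initial population of countries: random real-valued vectors.
    countries = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(n_countries)]
    countries.sort(key=cost)                    # strongest (lowest-cost) countries first
    imperialists = countries[:n_imperialists]   # the best countries become imperialists
    colonies = countries[n_imperialists:]
    empire_of = [i % n_imperialists for i in range(len(colonies))]  # simple round-robin assignment

    for _ in range(n_iterations):
        for c, colony in enumerate(colonies):
            imp = imperialists[empire_of[c]]
            # Assimilation: move the colony a random fraction of the way towards
            # its imperialist along the connecting vector.
            step = rng.uniform(0, beta)
            colonies[c] = [xc + step * (xi - xc) for xc, xi in zip(colony, imp)]
            # If the moved colony became stronger than its imperialist, they swap roles.
            if cost(colonies[c]) < cost(imp):
                imperialists[empire_of[c]], colonies[c] = colonies[c], imp
    return min(imperialists, key=cost)

best = ica_continuous(sphere)
print(sphere(best))  # best cost found: no worse than the best initial country, typically much lower
```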

4. Discrete Binary Imperialist Competitive Algorithm (DB-ICA)

Although the imperialist competitive algorithm was introduced for solving optimization problems, its binary model for discrete problems has been neglected. The operators of the imperialist competitive algorithm can be divided into two groups:

Independent constructional operators: these operators do not depend on the coding used in the problem and usually have a fixed structure. For example, all decisions in the imperialist-competition step are made on the basis of the costs of countries and empires; the coding of the countries, or the way they are represented, plays no role in that process.

Dependent operators: such operators must be designed with the problem in mind. For example, in a continuous problem the assimilation operator moves the colony towards the imperialist along a specific direction and vector, so the colony approaches the imperialist in the Euclidean sense. In a discrete problem, however, a Euclidean approach does not necessarily mean getting closer at all, and in some cases it is completely meaningless.

To solve a discrete problem with the imperialist competitive algorithm, only the problem-dependent operators and parts need to change. These parts are: (1) creation of the initial countries, (2) the assimilation operator and (3) the revolution operator.

The changes needed in the creation of the initial countries depend on the kind of discrete problem. Discrete problems can be divided into three groups: (1) discrete problems with a continuous nature, (2) permutation discrete problems and (3) binary discrete problems. Discrete problems with a continuous nature require no major changes in the algorithm, but several functions must change for the other two groups. The binary knapsack problem belongs to the third group. In this case the function that creates the initial countries must be defined so that the created countries conform to the coding chosen for the discrete problem; for the 1-0 knapsack problem these codes are strings of 0s and 1s, one bit per object, just like a chromosome in the genetic algorithm.

In this article, crossover is used as the assimilation operator, exactly as in the genetic algorithm, to define the new model of the discrete imperialist competitive algorithm. The best way to obtain a revolution operator is to reuse the function that creates the initial countries; if that function is defined properly, it is a suitable substitute for converting the continuous revolution operator to the discrete case. In general, new random changes can be defined and different revolution operators can be created for the countries according to the problem at hand. The important point is that the country created by such a revolution should differ from the original country in a large, random and unexpected way. In this article the mutation operator, the same operator used in the genetic algorithm, serves as the revolution operator.

The following algorithm executes the assimilation operator:
1. Two gene positions are chosen randomly in the imperialist.
2. The values of the colony between these two positions are replaced by the imperialist's genes at the same positions.
3. Steps 1 and 2 are repeated for all colonies of all empires.

Tab. 1: A sample of assimilation, showing the bit strings of the imperialist, the colony, and the changed colony after the segment between the two chosen positions is copied from the imperialist.
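The two problem-dependent operators described above can be written down directly. The sketch below is illustrative Python, not the authors' MATLAB code, and the function names (create_country, assimilate, revolve) and the mutation probability are assumptions; it implements the two-point-crossover assimilation of steps 1-3 and a GA-style bit-flip mutation as the revolution operator.

```python
import random

rng = random.Random(1)

def create_country(n_objects):
    """Initial-country function: a random 0/1 string with one bit per object."""
    return [rng.randint(0, 1) for _ in range(n_objects)]

def assimilate(imperialist, colony):
    """Two-point crossover: copy the imperialist's segment between two random
    positions into the colony (steps 1 and 2 of the assimilation algorithm)."""
    lo, hi = sorted(rng.sample(range(len(colony)), 2))
    changed = colony[:]
    changed[lo:hi + 1] = imperialist[lo:hi + 1]
    return changed

def revolve(country, p_mutation=0.1):
    """Revolution as GA-style mutation: flip each bit with a small probability."""
    return [1 - bit if rng.random() < p_mutation else bit for bit in country]

imperialist, colony = create_country(8), create_country(8)
print(imperialist, colony, assimilate(imperialist, colony))  # cf. Tab. 1
```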
5. Simulation Results

The results of the tests are presented in this section. The mentioned algorithms were executed in MATLAB on a system with a 1.67 GHz processor and 1 GB of RAM. The data used for the problem are the same as in [2]: the number of objects is 16 (n = 16), the weight of object i (i = 0, 1, ..., 15) equals 2^i, and the value of every object equals 1. The knapsack capacity (W) was arranged within a determined range; the optimal response of the problem is 8191, and changing the knapsack capacity within that range does not change the optimum response, it only makes the optimum harder to find. All algorithms were executed under the same conditions for 5 generations, and the best values produced by each were compared with those of the discrete binary imperialist competitive algorithm; the results are depicted in Figures 1 to 5. The superiority and stability of DB-ICA in comparison with the other optimization algorithms can be seen in all of the figures.

Fig. 1: Results of solving the 1-0 knapsack problem with GA and DB-ICA after 5 generations. GA fluctuates, whereas DB-ICA is relatively stable.

Fig. 2: Results of solving the 1-0 knapsack problem with TS and DB-ICA after 5 generations. Although TS is relatively stable, like DB-ICA, it stabilizes on a specific response near the global optimum, whereas DB-ICA stabilizes on the optimal response itself.

Fig. 3: Results of solving the 1-0 knapsack problem with SA and DB-ICA after 5 generations. SA fluctuates strongly, whereas DB-ICA is relatively stable.

Fig. 4: Results of solving the 1-0 knapsack problem with PSO and DB-ICA after 5 generations. PSO fluctuates strongly, whereas DB-ICA is relatively stable.

Fig. 5: Results of solving the 1-0 knapsack problem with ACO and DB-ICA after 5 generations. ACO fluctuates strongly, whereas DB-ICA is relatively stable.

Fig. 6: Average of the best optimal results after 5 repetition cycles for all algorithms. DB-ICA attains the highest average.

The average of the best results of each algorithm over 5 repetition cycles is depicted in Figure 6; as can be seen, DB-ICA has the average closest to the optimal response. A measure called the Hit Rate is introduced in [18] to evaluate how efficiently optimization algorithms solve the 1-0 knapsack problem. It expresses the ability of an algorithm to approach the optimal response within a tolerance band whose relative width is the Hit Scale. For example, if the global optimum equals 10 and the Hit Scale equals 0.3, then any response between 7 and 13 is counted as a hit, i.e. the Hit Rate equals 1. By varying the Hit Scale between 0.02 and 0.3, the ability of the algorithms to approach the global optimum can be compared. As Figure 7 shows, DB-ICA has the level closest to 1, so this algorithm is clearly able to approach the optimal response.

Fig. 7: Hit Rate of all tested algorithms as a function of the Hit Scale. DB-ICA attains the highest Hit Rate.
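Based on the Hit Rate description quoted from [18] above, the measure can be computed from a set of final responses as in the following sketch. The interpretation of the Hit Rate over several runs as the fraction of runs that fall inside the tolerance band is an assumption, as are the Python names used here.

```python
def hit_rate(responses, global_optimum, hit_scale):
    """Fraction of responses lying within +/- hit_scale * global_optimum of the optimum
    (e.g. optimum 10 and Hit Scale 0.3 give the band [7, 13], as in the text)."""
    tolerance = hit_scale * abs(global_optimum)
    hits = sum(1 for r in responses if abs(r - global_optimum) <= tolerance)
    return hits / len(responses)

# Example with the numbers used in the text: optimum 10, Hit Scale 0.3.
print(hit_rate([7, 9.5, 13, 14], global_optimum=10, hit_scale=0.3))  # 0.75
```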

6. Results and Future Works

This article first reviewed a number of heuristic methods and then offered a new version of the imperialist competitive algorithm for solving binary discrete problems. The executed experiments show that the offered binary model of ICA performs better than several other well-known heuristic algorithms; in terms of the Hit Rate, DB-ICA performed 30% better than the best of the compared patterns. One advantage of the suggested algorithm is the novelty of its idea as one of the first discrete algorithms based on a social-political process, together with its ability to reach the global optimum of the 1-0 knapsack problem in comparison with the other optimization algorithms. The suggested pattern in this article has focused only on one sample of binary problem among the discrete problems; the mentioned method could be applied to other, non-binary discrete problems in future work. No comparison of processor usage was made in this article; given the weakness of the imperialist competitive algorithm in this respect, optimizing processor usage in discrete models can also be investigated in future research.

7. References

[1] Atashpaz-Gargari, E. and C. Lucas. Imperialist competitive algorithm: an algorithm for optimization inspired by imperialistic competition. 2007: IEEE.
[2] Atashpaz-Gargari, E. and C. Lucas. Designing an optimal PID controller using Colonial Competitive Algorithm. 2007.
[3] Kaveh, A. and S. Talatahari, Optimum design of skeletal structures using imperialist competitive algorithm. Computers & Structures, 2010. 88(21-22): p. 1220-1229.
[4] Holland, J.H., Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence. 1992: The MIT Press.
[5] Ishibuchi, H., T. Murata, and S. Tomioka. Effectiveness of genetic local search algorithms. 1997.
[6] Kennedy, J. and R.C. Eberhart. A discrete binary version of the particle swarm algorithm. 1997: IEEE.
[7] Kennedy, J. and W.M. Spears. Matching algorithms to problems: an experimental test of the particle swarm and some genetic algorithms on the multimodal problem generator. 1998: IEEE.
[8] Bak, P. and K. Sneppen, Punctuated equilibrium and criticality in a simple model of evolution. Physical Review Letters, 1993. 71(24): p. 4083-4086.
[9] Moser, I. and T. Hendtlass. Solving problems with hidden dynamics - comparison of extremal optimization and ant colony system. 2006.
[10] Boettcher, S. and A.G. Percus, Optimization with extremal dynamics. Physical Review Letters, 2001. 86(23): p. 5211-5214.

[11] Boettcher, S. and A. Percus, Nature's way of optimizing. Artificial Intelligence, 2000. 119(1-2): p. 275-286.
[12] Glover, F. and M. Laguna, Tabu search. Vol. 1. 1998: Kluwer Academic Publishers.
[13] Andersson, P., Using Tabu search to solve the 0-1 Knapsack. 2004.
[14] Metropolis, N., et al., Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 1953. 21(6): p. 1087-1092.
[15] Cerny, V., Thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm. Journal of Optimization Theory and Applications, 1985. 45(1): p. 41-51.
[16] Kirkpatrick, S., C.D. Gelatt, and M.P. Vecchi, Optimization by simulated annealing. Science, 1983. 220(4598): p. 671-680.
[17] Atashpaz-Gargari, E., Expanding the Social Optimization Algorithm and Evaluating its Efficiency. Master's thesis, Computer and Electrical Engineering, 2008.
[18] Wang, G., C. Chen, and K. Szeto, Accelerated genetic algorithms with Markov Chains. Nature Inspired Cooperative Strategies for Optimization (NICSO 2010): p. 245-254.