Applying Particle Swarm Optimization to Software Testing
Andreas Windisch
DaimlerChrysler AG, Research and Technology
Alt-Moabit 96a, D-10559 Berlin, Germany

Stefan Wappler
Technical University of Berlin, DaimlerChrysler AITI
Ernst-Reuter-Platz 7, D-10587 Berlin, Germany

Joachim Wegener
DaimlerChrysler AG, Research and Technology
Alt-Moabit 96a, D-10559 Berlin, Germany

ABSTRACT
Evolutionary structural testing is an approach to automatically generating test cases that achieve high structural code coverage. It typically uses genetic algorithms (GAs) to search for relevant test cases. In recent investigations particle swarm optimization (PSO), an alternative search technique, often outperformed GAs when applied to various problems. This raises the question of how PSO competes with GAs in the context of evolutionary structural testing. In order to contribute to an answer to this question, we performed experiments with 25 small artificial test objects and 13 more complex industrial test objects taken from various development projects. The results show that PSO outperforms GAs for most code elements to be covered in terms of effectiveness and efficiency.

Categories and Subject Descriptors
D.2.5 [Software Engineering]: Testing and Debugging - Test coverage of code, Testing tools

General Terms
Verification

Keywords
evolutionary testing, genetic algorithm, particle swarm optimization, automatic test case generation

© ACM, 2007. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version will be published in GECCO '07: Proceedings of the 2007 Conference on Genetic and Evolutionary Computation, July 7-11, 2007, London, England, United Kingdom.

1. INTRODUCTION
Evolutionary structural testing [8, 12, 14, 15], a search-based approach to automatically generating relevant unit test cases, is a well-studied technique that has been shown to be successful for numerous academic test objects as well as for various industrial ones. Because of the complex search spaces that most test objects imply and that are often hard to understand due to their high dimensionality, genetic algorithms have been regarded as especially adequate search strategies, since they are able to deal with multi-modality, non-linearity, and discontinuity quite well. However, genetic algorithms have started getting competition from other heuristic search techniques, such as particle swarm optimization. Various works (e.g. [1, 4, 5, 9, 7]) show that particle swarm optimization is equally well suited or even better than genetic algorithms for solving a number of test problems (the term test problem does not relate to software testing here, but rather to well-defined optimization problems used to compare the performances of different optimization techniques). At the same time, a particle swarm algorithm is much simpler, easier to implement and has fewer parameters that the user has to adjust than a genetic algorithm. The performance and the above-mentioned characteristics of particle swarm optimization favor its application as a search engine for evolutionary structural testing.

This paper reports on our results from an empirical comparison of a genetic algorithm and a particle swarm algorithm applied to evolutionary structural testing. We selected 25 artificial test objects that cover a broad variety of search space characteristics (e.g. varying number of local optima), and 13 industrial test objects taken from various development projects.
The results indicate that particle swarm optimization is well suited as a search engine for evolutionary structural testing and tends to outperform genetic algorithms in terms of the code coverage achieved by the delivered test cases and the number of needed evaluations.

The paper is structured as follows: section 2 introduces evolutionary structural testing, section 3 describes the genetic algorithm and the particle swarm optimization used in more detail, section 3.3 deals with previous comparisons of PSO and GAs in general, section 4 includes the results of the experiments, and section 5 summarizes the paper and gives suggestions for future work.

2. EVOLUTIONARY STRUCTURAL TESTING
Since it is apparently impossible to exercise a software unit with all possible combinations of input data, a subset of test data must be found that is considered to be relevant or adequate, respectively. Therefore the generation of relevant test cases for a given software unit is a critical task that directly affects the quality of the overall testing of this unit. A test case is a set of input data with which the unit is to be executed, together with the expected outcome. Often, the structural properties of the software unit are considered to answer the question as to what relevant test cases are.
For instance, a set of test cases that lead to the execution of each statement of the software unit under test is said to be adequate with respect to statement coverage. Branch coverage, another coverage criterion requiring all branches of the control flow graph of the unit under test to be traversed, is demanded by various industrial testing standards and guidelines. In general, the process of test case generation is time-consuming and error-prone when done manually. Evolutionary structural testing (EST) [15, 8, 12, 14], an automatic test case generation technique, has been developed in order to provide relief.

EST interprets the task of test case generation as an optimization problem and tries to solve it using a search technique, e.g. a genetic algorithm. A genetic algorithm is a meta-heuristic optimization technique, which is dealt with in more detail in section 3.1. It requires a fitness function to be provided, which is used to assess the ability of a particular solution to solve the given optimization problem. Depending on the selected coverage criterion, the source code under test is partitioned into single test goals that have to be optimized separately. In the context of EST, the construction of fitness functions using the two distance metrics approximation level and branch distance has proven of value [14]. The approximation level relates to the control flow graph of the unit under test. It corresponds to the number of critical branches that lie between the problem node and the target (where the target is the code element to be covered and the problem node is the node at which execution diverges down a branch that makes it impossible to reach the target). The branch distance relates to the condition assigned to the problem node. It expresses how close the evaluation of this condition is to delivering the boolean result necessary for reaching the target node.
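To make the two distance metrics more concrete, the following sketch shows how a fitness value for a single test goal could be computed. It is a minimal illustration based on common branch-distance rules from the search-based testing literature, not the exact formulas of the test system used here; the constant K, the normalization function and the example condition are assumptions.

```python
# Minimal sketch of an EST fitness value for one test goal (branch coverage).
# Assumes the usual branch-distance rules from the search-based testing
# literature; the constant K and the normalization are illustrative choices.

K = 1.0  # penalty added when a relational predicate is not satisfied

def branch_distance_less_than(a, b):
    """Distance of the condition 'a < b' from evaluating to true."""
    return 0.0 if a < b else (a - b) + K

def branch_distance_equals(a, b):
    """Distance of the condition 'a == b' from evaluating to true."""
    return 0.0 if a == b else abs(a - b) + K

def normalize(distance):
    """Map a branch distance into [0, 1) so it only refines the approximation level."""
    return distance / (distance + 1.0)

def fitness(approximation_level, distance):
    """Lower is better: number of critical branches between the problem node and
    the target, refined by how close the diverging condition came to taking the
    required direction."""
    return approximation_level + normalize(distance)

# Example: execution diverged two critical branches away from the target,
# at a condition 'x < 10' that was evaluated with x = 42.
print(fitness(2, branch_distance_less_than(42, 10)))  # ~2.97
```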
3. APPLIED SEARCH TECHNIQUES
This section describes both search techniques to be compared, namely genetic search and particle swarm optimization. Sections 3.1 and 3.2 characterize both strategies in detail, whereas section 3.3 summarizes previous comparisons and their results.

3.1 Genetic Search
Genetic search, carried out by a genetic algorithm (GA), is a meta-heuristic optimization technique that mimics the principles of the Darwinian theory of biological evolution. Its adequacy for solving non-linear, multi-modal, and discontinuous optimization problems has drawn the attention of many researchers and practitioners during the last decades. A genetic algorithm works with a set of potential solutions for the given optimization problem. Through multiple modifications applied iteratively to the current set of solutions, better solutions and finally an optimum solution are supposed to be found. These modifications include crossover and mutation. While crossover generates new offspring solutions by combining multiple existing solutions, mutation randomly changes parts of an existing solution to yield a new one. The selection of candidate solutions that should undergo crossover and mutation is based on their fitness. The fitness of a solution is its ability to solve the optimization problem at hand; it is calculated using the fitness function. This function is problem-specific; its suitability essentially contributes to the success and performance of the optimization process.

Figure 1 shows the main workflow of a genetic algorithm. The evaluation of the new candidate solutions comprises the calculation of the fitness values. The algorithm terminates if either an optimum solution is found or other predefined termination criteria apply, e.g. a maximum number of iterations has been reached.

Figure 1: Workflow of a genetic algorithm (initialization of candidate solutions, evaluation of new candidate solutions, selection of promising candidate solutions, crossover and mutation, repeated until a termination criterion is met; result: global best solution)
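The workflow of figure 1 can be sketched in a few lines. The operators below (tournament selection, uniform crossover, Gaussian mutation) and the default parameters are illustrative assumptions and do not correspond to the GEATbx configuration used in the experiments (see section 4.2).

```python
import random

def genetic_search(fitness, dim, bounds, pop_size=40, generations=200):
    """Generic GA loop following figure 1: initialize, evaluate, select,
    recombine, mutate, until the termination criterion is met.
    Operators and defaults are illustrative, not the settings of section 4.2."""
    lb, ub = bounds
    pop = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        if fitness(best) == 0.0:           # optimum found: target covered
            break
        # selection: binary tournaments on fitness (lower is better)
        parents = [min(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        offspring = []
        for mom, dad in zip(parents[0::2], parents[1::2]):
            for _ in range(2):
                # uniform crossover: each gene taken from either parent
                child = [random.choice(pair) for pair in zip(mom, dad)]
                # mutation: small Gaussian perturbation, clipped to the bounds
                child = [min(ub, max(lb, g + random.gauss(0.0, 0.1 * (ub - lb))))
                         for g in child]
                offspring.append(child)
        pop = offspring[:pop_size]
        best = min(pop + [best], key=fitness)
    return best

# Usage: minimize a simple separable function over [-10, 10]^3
# print(genetic_search(lambda x: sum(abs(v) for v in x), 3, (-10.0, 10.0)))
```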
3.2 Particle Swarm Optimization
In comparison with genetic search, particle swarm optimization is a relatively recent optimization technique of the swarm intelligence paradigm. It was first introduced in 1995 by Kennedy and Eberhart [10, 2]. Inspired by social metaphors of behavior and swarm theory, simple methods were developed for efficiently optimizing non-linear mathematical functions. PSO simulates swarms such as herds of animals, flocks of birds or schools of fish. Similar to genetic search, the system is initialized with a population of random solutions, called particles. Each particle maintains its own current position, its present velocity and its personal best position explored so far. The swarm is also aware of the global best position achieved by all its members. The iterative application of update rules leads to a stochastic manipulation of velocities and flying courses. During the process of optimization the particles explore the D-dimensional search space, where their trajectories depend on their personal experience as well as on that of their neighbors and of the whole swarm. This leads to further exploration of regions that have turned out to be profitable. The best previous position of particle i is denoted by pbest_i, the best previous position of the entire population is called gbest.

Figure 2 shows the general workflow of a PSO algorithm. The termination criterion can be either a specific fitness value, the achievement of a maximum number of iterations or the general convergence of the swarm itself.

Figure 2: Workflow of a PSO algorithm (initialization of candidate solutions, fitness evaluation, update of personal/global best values if necessary, velocity and position updates, repeated until a termination criterion is met; result: global best solution)

Since its first presentation, many improvements and extensions have been worked out to improve the algorithm in various ways and have provided promising results for the optimization of well-known test functions. A novel and auspicious approach is the Comprehensive Learning Particle Swarm Optimizer (CLPSO) [11]. It applies a new learning strategy, where each particle learns from different neighbors for each dimension separately, dependent on its assigned learning rate Pc_i. This happens until the particle does not achieve any further improvements for a specific number of iterations, called the refreshing gap m, which finally yields a reassignment of particles. The learning rate Pc_i of particle i is a probability lying between 0.05 and 0.5, determining whether particle i learns from its own or another particle's pbest for the current dimension. This probability is assigned to each particle during initialization and remains unchanged in order to assure the particles' diverse levels of exploration and exploitation abilities. If the considered dimension is to be learned from another particle's pbest, this particle is appointed in a tournament selection manner: two particles are randomly chosen and the better one is selected. This is done for each dimension d, and the resulting list of particles f_i(d) is used in the following velocity update rule for time step t:

v_i^d(t) = ω · v_i^d(t-1) + c · r_i^d · (pbest^d_{f_i(d)}(t-1) - x_i^d(t-1))    (1)

where the D-dimensional vectors x_i = (x_i^1, ..., x_i^d, ..., x_i^D) and v_i = (v_i^1, ..., v_i^d, ..., v_i^D), with x_i^d ∈ [lb^d, ub^d], v_i^d ∈ [-Vmax^d, Vmax^d] and d ∈ {1, ..., D}, represent the position and velocity of particle i. lb^d and ub^d describe the dth dimension's lower and upper bounds, whereas Vmax^d defines its respective maximum and minimum velocity. Accordingly, pbest^d_{f_i(d)} is the personal best position of the particle assigned for dimension d. The inertia weight ω controls the impact of the previous history on the new velocity. c is an acceleration coefficient weighting the influence of the cognitive and social components in proportion to the ith particle's present momentum. Finally, r_i^d is a uniformly distributed random variable in the range [0, 1] for dimension d.

The fitness of a particle often depends on all D parameters. Hence, a particle close to the optimum in some dimensions may nonetheless be evaluated with a poor fitness when processed by the original PSO version, due to the poor solutions in the remaining dimensions. This is counteracted by the learning strategy presented in [11], which consequently enables higher-quality solutions to be located. It was shown that CLPSO, in comparison to some other PSO variants, yields significantly better solutions for multi-modal problems. Owing to this, we chose CLPSO as the PSO variant for the upcoming comparison; its detailed configuration is shown in section 4.2.
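The comprehensive learning idea behind update rule (1) can be illustrated as follows: for every dimension, particle i is guided either by its own pbest or, with probability Pc_i, by the pbest of an exemplar chosen via tournament selection. This is a simplified sketch of the strategy from [11]; the refreshing-gap handling is omitted and all names and defaults are illustrative.

```python
import random

def choose_exemplars(i, pbest, pbest_fitness, pc_i, dim):
    """For each dimension, pick the pbest that particle i learns from:
    with probability pc_i the tournament winner of two other random particles,
    otherwise its own pbest (comprehensive learning strategy, simplified)."""
    others = [j for j in range(len(pbest)) if j != i]
    exemplars = []
    for d in range(dim):
        if random.random() < pc_i:
            a, b = random.sample(others, 2)
            winner = a if pbest_fitness[a] < pbest_fitness[b] else b  # lower is better
            exemplars.append(pbest[winner][d])
        else:
            exemplars.append(pbest[i][d])
    return exemplars

def update_particle(x, v, exemplars, w, c, vmax):
    """Velocity update of rule (1) plus the usual position update,
    with velocities clamped to [-vmax^d, vmax^d]."""
    new_v, new_x = [], []
    for d in range(len(x)):
        vd = w * v[d] + c * random.random() * (exemplars[d] - x[d])
        vd = max(-vmax[d], min(vmax[d], vd))
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v
```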
3.3 Comparison of GA and PSO
Genetic algorithms have been popular because of the parallel nature of their search and essentially because of their ability to effectively solve non-linear, multi-modal problems. They can handle both discrete and continuous variables without requiring gradient information. In comparison, PSO is well known for its easy implementation, its computational inexpensiveness and its fast convergence towards optimal areas of the solution space. Although it yields its best performance on continuous-valued problems, it can also handle discrete variables after slight modifications.

Many researchers have compared both optimization techniques over the last years. Hodgson [5] compared them by applying them to the atomic cluster optimization problem. The task consists of minimizing a highly multi-modal energy function of clusters of atoms. His calculations show PSO to be noticeably superior to both a generic and a purpose-built problem-specific GA. Clow and White [1] compared both techniques by using them to train artificial neural networks used to control virtual racecars. Due to the continuousness of the neural weights being optimized, PSO turned out to be superior to the GA in all tests performed, yielding a higher and much faster growing mean fitness. Hassan, Cohanim and de Weck [4] compared both techniques on a set of eight well-known optimization benchmark test problems and concluded that PSO was equally effective and more efficient in general. Nevertheless, the superiority of PSO turned out to be problem-dependent. The difference in computational efficiency was found to be greater when the search strategies were used to solve unconstrained problems with continuous variables and less when they were applied to constrained continuous or discrete variables. Jones [9] also compared both approaches by letting them identify two mathematical model parameters. Although he noted that both approaches were equally effective, he concluded that the GA outperformed PSO with regard to efficiency. However, attention should be paid to Jones' PSO variant, as it used the same random variables for all dimensions during one velocity update, which turned out to perform worse than using different variables for each dimension as proposed originally. This could be the reason for the poor results for PSO presented by Jones. Horák, Chmela, Oliva and Raida [7] analyzed the abilities of PSO and GAs to optimize dual-band planar antennas for mobile communication applications. They found that PSO was able to obtain slightly better results than the GA, but that PSO took more CPU time. The higher effectiveness and efficiency generally ascribed to PSO leads to the hypothesis that it will improve evolutionary structural testing, too.
4. EMPIRICAL COMPARISON
This section describes the experimental setup and the achieved results. Sections 4.1 and 4.2 describe the test system used to realize evolutionary structural testing and the configuration of both search engines to be compared, respectively. Section 4.3 provides some insight into the test objects used and section 4.4 presents the obtained results.

4.1 Test System
Over the last years, DaimlerChrysler has developed a test system that implements the ideas of evolutionary structural testing [14]. Figure 3 shows a high-level view of the structure of this system. The inputs to this system are the source code under test (including the main module, all required header files and dependent modules) and the interface settings. These settings describe which function to test and which value ranges to use for the input variables. For instance, binary signal variables represented by an integer input variable in the source code might be restricted to the values {0, 1}. The output of the test system is a set of test cases that are intended to achieve a high structural coverage of the given source code. At the moment, the coverage metrics statement coverage, branch coverage, and two versions of condition coverage are implemented and selectable by the user.

Figure 3: Abstract view of the test system structure (inputs: source code and interface settings; components: Test Preparer with parser, instrumenter, compiler and linker, Test Control, and the search engine exchanging candidate solutions and fitness values; output: final set of test cases)

The test system consists of the three major components Test Preparer, Test Control, and a search engine. Test Preparer analyzes the given source code and adds the instrumentation statements. These statements are necessary to comprehend the execution flow taken when executing the function under test with a particular set of input values. It also compiles and links the instrumented source code. Additionally, it provides descriptive metadata for the test object, such as the interface specification and the control flow graph. Test Control takes the instrumented test object and performs individual optimizations for each code element to be covered (e.g. in the case of branch coverage, it runs an optimization for each branch of the control flow graph of the function under test). An optimization consists of the following steps: Initially, the interface of the function under test (the input variables) is reported to the search engine in order to generate test inputs that correspond to the functional interface. Afterwards, the search engine provides a set of test inputs. Test Control executes the function under test using the provided inputs and calculates the fitness values based on the produced execution flow. Then, it reports the fitness values to the search engine and waits for the next set of test inputs unless a test input is found that covers the code element in question or another termination criterion applies. These steps are repeated for each test goal. Finally, Test Control creates a final set of test cases that are delivered to the user. The test system has normally been configured to use the Genetic and Evolutionary Algorithms Toolbox (GEATbx) [3] as a search engine. For our experiments, we developed a new toolbox that implements the PSO algorithm mentioned in section 3.2 and provides the same interface as the GEATbx. Hence, the test system can easily switch between the two different search engines.
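The interaction between Test Control and the search engine can be pictured roughly as below. The class and method names are purely hypothetical and do not reflect the actual interface of the DaimlerChrysler test system or of the GEATbx.

```python
def optimize_test_goal(search_engine, function_under_test, interface, goal,
                       max_evaluations=50_000):
    """Illustrative rendering of one Test Control optimization: the search
    engine proposes test inputs, the instrumented function is executed, and
    the resulting fitness values are fed back until the goal is covered or
    the evaluation budget is exhausted (all names are hypothetical)."""
    search_engine.configure(interface)            # report input variables and ranges
    evaluations = 0
    while evaluations < max_evaluations:
        candidate_inputs = search_engine.next_inputs()
        fitness_values = []
        for inputs in candidate_inputs:
            trace = function_under_test.execute(inputs)   # instrumented run
            fitness_values.append(goal.fitness(trace))    # approximation level + branch distance
            evaluations += 1
            if goal.is_covered(trace):
                return inputs, evaluations                # covering test case found
        search_engine.report(fitness_values)
    return None, evaluations                              # goal not covered within the budget
```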
4.2 Configuration of the search algorithms
Except for two slight modifications described below, we used CLPSO with all variables set to the values suggested in [11]. The configuration is shown in table 1.

Table 1: PSO settings
Parameter                     Value
No. of particles              40
Inertia weight ω              linearly decreased from 0.9 to 0.4
Acceleration coefficient c
Velocity bound Vmax^d         (ub^d - lb^d)/2
Refreshing gap m              7
Boundary condition            damping walls
Termination                   2 iterations

The mentioned modifications concern the ability to handle discrete variables and a changed search bounds condition. The former is solved by simply rounding potential solutions to the nearest integer number when the variables being investigated are discrete. CLPSO originally uses a boundary condition known as invisible walls [13], meaning that the particles are allowed to exceed the borders, but their fitness will only be calculated and updated if they are within the valid range. However, our experiments with different boundary conditions have shown that this behavior can lead to a sustained and undesired disregard of particles in some particular cases. Other conditions such as absorbing and reflecting walls [13] proved to be more or less appropriate depending on the problem. Accordingly, we decided to use the damping walls approach proposed by Huang and Mohan [6], which realizes a combination of absorbing and reflecting walls by stochastically reducing the momentum while reflecting particles at the imaginary boundary walls back into the search space.
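A sketch of the two modifications, rounding for discrete variables and a damping-walls boundary handling in the spirit of Huang and Mohan [6]; the concrete damping rule drawn here is an assumption for illustration rather than the exact rule used in the experiments.

```python
import random

def handle_boundary_damping(x, v, lb, ub):
    """Damping walls (after Huang and Mohan [6], simplified): a particle
    leaving [lb, ub] is placed back on the wall and reflected with a
    randomly reduced velocity, combining absorbing and reflecting walls."""
    if x < lb or x > ub:
        wall = lb if x < lb else ub
        x = wall
        v = -v * random.random()   # stochastic reduction of the momentum
    return x, v

def decode_discrete(x, is_discrete):
    """Discrete variables are handled by rounding the continuous particle
    position to the nearest integer before executing the function under test."""
    return round(x) if is_discrete else x
```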
Table 2 shows the settings used for the experiments that applied the GA as the search engine. These settings have emerged over the last few years through experimentation with numerous different industrial test objects.

Table 2: GA settings
parameter                          value
subpopulations   number            6
                 size              40 individuals each
competition      interval
                 rate
                 min. size
migration        topology          complete net
                 interval          3
                 rate
selection        name              stochastic universal sampling
                 pressure          1.7
                 generation gap    0.9
crossover        type              discrete
                 rate
mutation         type              real number
                 precision         7
mutation ranges  subpop. #1
                 subpop. #2
                 subpop. #3
                 subpop. #4
                 subpop. #5
                 subpop. #6
reinsertion      type              fitness-based
termination      generations       2

The first 3 rows indicate that the GA implements a regional model with 6 subpopulations. Due to the different mutation rates per subpopulation, some subpopulations might be more successful than others. Competition among these subpopulations means that more successful subpopulations receive individuals from less successful ones during the search. Regardless of competition, the best individuals are exchanged among the subpopulations every 3 generations via migration. We decided to use branch coverage as the coverage criterion since its relevance is widely accepted and enforced by various industrial quality standards. Branch coverage measures the number of branches of the control flow graph of the function under test that are traversed when the generated test cases are executed. For the purpose of test case generation, each branch becomes an individual test goal for which an individual optimization is carried out. In order to acquire results with sufficient statistical significance, all experiments for both the GA and PSO were repeated 30 times.

4.3 Test Objects
A total of 25 small artificial test objects have been created that feature different grades of complexity. They differ in both the number and type of parameters and the number of local optima. The local optima relate to the search space of the most difficult test goal that each test object exhibits. Each test object has one condition that needs to be satisfied, in the context of branch coverage, in order to successfully traverse the corresponding path of its control flow graph. In this context, one branch is typically easy to cover without requiring an optimization, while its sibling branch is relatively hard to cover and requires an optimization. The characteristics of the test objects are shown in table 3. Column vars shows the number of input parameters a test object possesses and that are to be optimized. Local opt. shows the number of local optima of the search space. The columns boolean, integer and double indicate the types of the parameters.

Table 3: Artificial test objects (for each test object f1 to f25: the number of input parameters, the number of local optima, and the parameter types boolean, integer and double)

For example, f2 is a function with two parameters (one of type integer and one of type double), constructed to realize the fitness landscape shown in figure 4. It features a total of 24 local optima around the global one in position (0, 0). Double-valued parameters were limited to the range [-1e6, 1e6] to limit the huge search space.

Figure 4: Fitness landscape of the true branch of test object f2 (fitness plotted over variable 1, integer, and variable 2, double)
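For illustration, a hypothetical test object in the style of table 3 might look as follows: one branch is covered by almost any input, while its sibling requires the search to hit a narrow region of a mixed (boolean, integer, double) input space. This is not one of the 25 objects used in the experiments, merely an assumed example of the construction principle.

```python
def example_test_object(flag: bool, count: int, threshold: float) -> int:
    """Hypothetical artificial test object: the first branch is trivially
    covered by almost any input, while its sibling requires the search to
    hit a narrow region of the (boolean, integer, double) input space."""
    if not (flag and count == 4711 and abs(threshold - 3.14159) < 1e-3):
        return 0    # easy branch: covered by almost every input
    return 1        # hard branch: the actual test goal

# A covering test case for the hard branch would be, for example:
# example_test_object(True, 4711, 3.14159)
```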
Our assortment of industrial test objects is shown in table 4. These functions are taken, for example, from current Mercedes and Chrysler development projects. They differ in complexity with regard to code length and both the numbers and types of variables.

Table 4: Industrial test objects (BrakeAssistant, BrakeAssistant2, ClassifyTriangle, De-icer, De-icer2, EmergencyBrake, ICCDeterminer, LogicModule, Multimedia, Preprocessor, PrototypingTool, Reaction, Warning; for each function: lines of code, number of branches, and number of variables)

Both BrakeAssistant and BrakeAssistant2 are used for brake coordination in the brake assist system, whereas ClassifyTriangle is an often-used test function that performs a classification of triangles based on their side lengths. De-icer and De-icer2 are both intended to control the windshield defroster heating units. EmergencyBrake is part of an emergency brake system. ICCDeterminer determines the reasons why the cruise control of the vehicle could not be activated. LogicModule is used for the definition of the current operational mode of the engine, and Multimedia for the handling of peripheral equipment. Preprocessor realizes parts of the object preprocessing required for situation analysis, and PrototypingTool controls the interaction of new code with a prototype of the engine for testing purposes. Reaction supervises possible system modes and determines the expected reaction to emerging failures, and Warning manages warnings that occur.

4.4 Results
The convergence characteristics for the artificial test objects are shown in figure 5. The results for the test objects f1, f4, f9, f13, f18 and f22 have been omitted. These test objects exclusively use boolean variables, and both the GA and PSO found optimum solutions during initialization. Considering the functions that use several parameters of the same type (functions f2, f, f4, f9 and f23 for type integer and functions f3, f, f5, f2 and f24 for type double), it becomes obvious that both techniques need more time the more parameters there are. A detailed analysis of these results reveals that the GA was unable to reach the desired global optimum within the given number of fitness function evaluations when it had to optimize more than one parameter. In contrast, PSO reached it when optimizing one, two and three parameters, regardless of the parameter type. Although both algorithms were unable to converge successfully when optimizing six or twelve parameters, PSO reached better solutions using the same number of iterations. The inclusion of artificially constructed local optima (functions f5, f6, f7 and f8) yielded similar results. For the functions that use a mixed parameter set including boolean, integer and double valued parameters (functions f6, f2 and f25), PSO outperformed the GA in all cases: either it reached the global optimum using fewer function evaluations or it yielded a better result after the expiration of the permitted evaluations. Adding local optima to two simple mixed parameter functions (f2 and f7) resulted in a slightly faster convergence of the GA compared to PSO. In 13 of the 19 cases shown, PSO outperformed the GA, while in the remaining 6 cases the GA outperformed PSO. However, while the difference is significant in 8 of the 13 cases in which PSO outperformed the GA, there is no case in which the superiority of the GA is significant. In general, the GA features a slightly faster convergence for simple functions, whereas PSO outperforms it primarily for complex functions with big search spaces. Additionally, PSO achieved either an equal or a significantly better solution compared to the GA for all artificial test objects.
Figure 6 presents the success rates of both the GA and PSO for the industrial test objects. Success rate means the relative frequency of a successful optimization (covering test case found), averaged over all test goals (which correspond to the branches) of a test object. The figure shows that PSO was able to find a covering test case for more test goals than the GA.

Figure 6: Experimental results for the effectiveness of the GA and PSO regarding the industrial test objects (success rate in % per test object)

Figure 7 summarizes the efficiency of the two optimization strategies for the industrial test objects. It shows the number of fitness function evaluations required for finding a covering test case or for failing, averaged over the test goals of each test object. The figure also shows the estimated standard deviation. In general, the GA needed more evaluations than PSO for all of the test objects. The smaller estimated standard deviations of the results of PSO indicate that PSO delivers a more reliable result than the GA does. The huge standard deviations are due to the averaging over each test goal of the test objects: some of these test goals were very easy to reach, whereas others could not be reached at all or needed more generations or iterations, respectively.

Figure 7: Experimental results for the efficiency of the GA and PSO regarding the industrial test objects (fitness function evaluations per test object)

Although the average number of fitness function evaluations for PSO is smaller than for the GA, it is worth mentioning that some branches could be covered faster by the GA.
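The two measures reported in figures 6 and 7 can be made precise by a small aggregation over the raw optimization runs; the record layout assumed below (one (covered, evaluations) pair per test goal and repetition) is illustrative.

```python
from statistics import mean, stdev

def summarize(runs):
    """Aggregate the raw results of one test object. Each run record is
    assumed to be a tuple (covered: bool, evaluations: int), one per test
    goal and repetition. Success rate is the relative frequency of covered
    goals; efficiency is the mean (and standard deviation) of the number of
    fitness function evaluations needed."""
    success_rate = 100.0 * sum(1 for covered, _ in runs if covered) / len(runs)
    evals = [e for _, e in runs]
    return success_rate, mean(evals), stdev(evals) if len(evals) > 1 else 0.0

# Example: three test goals, two of them covered
# print(summarize([(True, 1200), (True, 300), (False, 48000)]))
```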
Figure 5: The convergence characteristics of the GA and PSO regarding the artificial test objects (one chart per test object): the x-axis shows the number of fitness function evaluations, the y-axis shows the best fitness value. Results are averaged over 30 runs. Note that the range of the y-axis may be different for different charts.
In order to examine the statistical significance of the results, a t-test was performed for all the test goals of each test object. The outcome of this test indicates that the majority of the differences in efficiency between the GA and PSO is statistically significant. About 80% of the differences turned out to be statistically significant (confidence 95%). Overall, 74% of the differences turned out to be even very significant (confidence 99%). The obtained results lead to the conclusion that, when applied to evolutionary structural testing, PSO tends to outperform the GA for complex functions in terms of effectiveness and efficiency.

5. CONCLUSION AND FUTURE WORK
This paper reported on the application of particle swarm optimization to structural software testing. We described the design of our test system, which allows easy exchange of the employed search strategy. This system was used to carry out experiments with 25 artificial and 13 industrial test objects that exhibit different search space properties. Both particle swarm optimization and genetic algorithms were used to automatically generate test cases for the same set of test objects. The results of the experiments show that particle swarm optimization is competitive with genetic algorithms and even outperforms them for complex cases. Even though the genetic algorithm yields a covering test case faster than particle swarm optimization in some cases, the latter is much faster than the genetic algorithm in the majority of the cases. This result indicates that particle swarm optimization is an attractive alternative to genetic algorithms, since it is just as good as or even better than genetic algorithms in terms of effectiveness and efficiency, and is a much simpler algorithm with significantly fewer parameters that need to be adjusted by the user.

However, more experiments with further test objects taken from various application domains must be carried out in order to be able to make more general statements about the relative performance of particle swarm optimization and genetic algorithms when applied to software testing. Systematically varying the algorithm settings for the experiments would also help to draw more comprehensive conclusions. As an alternative comparison approach, a theoretical analysis of both optimization techniques in the context of software testing would give some more insights into the suitability of either approach. These directions are all items for future work. Especially interesting is the development of a testability meta-heuristic that selects the most promising search strategy for each test goal individually. Software measures might help support this selection heuristic. Then, we would not only consider genetic algorithms and particle swarm optimization as relevant search strategies, but would also broaden the spectrum of candidate search techniques to include, for example, hill climbing, simulated annealing or even random search.

Acknowledgments
We would like to thank our colleague Harmen Sthamer for the thorough review of this paper and his valuable feedback. This work was funded by EU grant IST (EvoTest).

6. REFERENCES
[1] B. Clow and T. White. An evolutionary race: A comparison of genetic algorithms and particle swarm optimization for training neural networks. In Proceedings of the International Conference on Artificial Intelligence, IC-AI'04, Volume 2. CSREA Press, 2004.
[2] R. C. Eberhart and J. Kennedy. A new optimizer using particle swarm theory. In Proceedings of the 6th International Symposium on Micro Machine and Human Science, pages 39-43, 1995.
[3] Genetic and Evolutionary Algorithm Toolbox for use with Matlab (GEATbx).
[4] R. Hassan, B. Cohanim, and O. de Weck. A comparison of particle swarm optimization and the genetic algorithm. In Proceedings of the 46th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, 2005.
[5] R. J. W. Hodgson. Particle swarm optimization applied to the atomic cluster optimization problem. In GECCO 2002, pages 68-73, 2002.
[6] T. Huang and A. S. Mohan. A hybrid boundary condition for robust particle swarm optimization. IEEE Antennas and Wireless Propagation Letters, 4:112-117, 2005.
[7] J. Horák, P. Chmela, L. Oliva, and Z. Raida. Global optimization of the dual-band planar antenna: PSO versus GA. In Radioelektronika, 2006.
[8] B. F. Jones, H. Sthamer, and D. E. Eyres. Automatic test data generation using genetic algorithms. Software Engineering Journal, 11(5):299-306, September 1996.
[9] K. O. Jones. Comparison of genetic algorithm and particle swarm optimization. In Proceedings of the International Conference on Computer Systems and Technologies, 2005.
[10] J. Kennedy and R. C. Eberhart. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, volume 4, pages 1942-1948. IEEE Press, 1995.
[11] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Transactions on Evolutionary Computation, 10(3):281-295, 2006.
[12] R. P. Pargas, M. J. Harrold, and R. R. Peck. Test-data generation using genetic algorithms. Journal of Software Testing, Verification and Reliability, 9(4), 1999.
[13] J. Robinson and Y. Rahmat-Samii. Particle swarm optimization in electromagnetics. IEEE Transactions on Antennas and Propagation, 52(2):397-407, February 2004.
[14] J. Wegener, A. Baresel, and H. Sthamer. Evolutionary test environment for automatic structural testing. Information and Software Technology, 43(14):841-854, 2001.
[15] S. E. Xanthakis, C. C. Skourlas, and A. K. LeGall. Application of genetic algorithms to software testing. In Proceedings of the 5th International Conference on Software Engineering and its Applications, 1992.
