[1] R. Bellman, "Dynamic Programming", Science, Vol. 153, No. 3731, 1966, pp. 34-37.
[2] W. Kuo, V. R. Prasad, F. A. Tillman, and C.-L. Hwang, Optimal Reliability Design: Fundamentals and Applications, Cambridge University Press, 2001.
[3] J. A. Snyman, Practical Mathematical Optimization. Springer, 2005.
[4] I. BoussaïD, J. Lepagnot, and P. Siarry, "A Survey on Optimization Metaheuristics", Information Sciences, Vol. 237, 2013, pp. 82-117.
[5] A. Sezavar, H. Farsi, and S. Mohamadzadeh, "A Modified Grasshopper Optimization Algorithm Combined with CNN for Content Based Image Retrieval", International Journal of Engineering, Vol. 32, No. 7, 2019, pp. 924-930.
[6] A. H. Hosseinian and V. Baradaran, "A Multi-Objective Multi-Agent Optimization Algorithm for the Community Detection Problem", J. Inform. Syst. Telecommun., Vol. 6, No. 1, 2019, pp. 169-179.
[7] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization By Simulated Annealing", Science, Vol. 220, No. 4598, 1983, pp. 671-680.
[8] J. R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection, MIT Press, 1992.
[9] A. Walker, J. Hallam, and D. Willshaw, "Bee-Havior in a Mobile Robot: The Construction of a Self-Organized Cognitive Map and Its Use in Robot Navigation within a Complex, Natural Environment", IEEE International Conference on Neural Networks, 1993, pp. 1451-1456.
[10] F. Glover, "Tabu Search for Nonlinear and Parametric Optimization (With Links to Genetic Algorithms)", Discrete Applied Mathematics, Vol. 49, No. 1-3, 1994, pp. 231-255.
[11] J. Kennedy and R. Eberhart, "Particle Swarm Optimization", in Proceedings of ICNN'95-International Conference on Neural Networks, 1995, Vol. 4, pp. 1942-1948.
[12] K. M. Passino, "Biomimicry of Bacterial Foraging for Distributed Optimization and Control", IEEE Control Systems Magazine, Vol. 22, No. 3, 2002, pp. 52-67.
[13] D. Simon, "Biogeography-Based Optimization", IEEE Transactions on Evolutionary Computation, Vol. 12, No. 6, 2008, pp. 702-713.
[14] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey Wolf Optimizer", Advances in Engineering Software, Vol. 69, 2014, pp. 46-61.
[15] S. Mirjalili, "The Ant Lion Optimizer", Advances in Engineering Software, Vol. 83, 2015, pp. 80-98.
[16] S. Mirjalili, "Moth-flame Optimization Algorithm: A Novel Nature-Inspired Heuristic Paradigm", Knowledge-Based Systems, Vol. 89, 2015, pp. 228-249.
[17] S. Mirjalili, "Dragonfly Algorithm: A New Meta-Heuristic Optimization Technique for Solving Single-Objective, Discrete, and Multi-Objective Problems", Neural Computing and Applications, Vol. 27, No. 4, 2016, pp. 1053-1073.
[18] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-Verse Optimizer: A Nature-Inspired Algorithm for Global Optimization", Neural Computing and Applications, Vol. 27, No. 2, 2016, pp. 495-513.
[19] S. Mirjalili, "SCA: a Sine Cosine Algorithm for Solving Optimization Problems", Knowledge-Based Systems, Vol. 96, 2016, pp. 120-133.
[20] S. Mirjalili and A. Lewis, "The Whale Optimization Algorithm", Advances in Engineering Software, Vol. 95, 2016, pp. 51-67.
[21] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: A Bio-Inspired Optimizer for Engineering Design Problems", Advances in Engineering Software, Vol. 114, 2017, pp. 163-191.
[22] S. M. Almufti, "Historical Survey on Metaheuristics Algorithms", International Journal of Scientific World, Vol. 7, No. 1, 2019, pp. 1.
[23] S. Shirke and R. Udayakumar, "Evaluation of Crow Search Algorithm (CSA) for Optimization in Discrete Applications", International Conference on Trends in Electronics and Informatics (ICOEI), 2019, pp. 584-589.
[24] M. Dorigo and G. Di Caro, "Ant Colony Optimization: a New Meta-Heuristic", in Proceedings of the Congress on Evolutionary Computation-CEC99, Vol. 2, 1999, pp. 1470-1477.
[25] M. Clerc, Particle Swarm Optimization. John Wiley & Sons, 2010.
[26] J. G. Digalakis and K. G. Margaritis, "On Benchmarking Functions for Genetic Algorithms", International Journal of Computer Mathematics, Vol. 77, No. 4, 2001, pp. 481-506.
[27] M. Molga and C. Smutnicki, "Test Functions for Optimization Needs", Test Functions for Optimization Needs, Vol. 101, 2005, pp. 48.
[28] X.-S. Yang, "Firefly Algorithm, Stochastic Test Functions and Design Optimisation," International Journal of Bio-Inspired Computation, Vol. 2, No. 2, 2010, pp. 78-84.
[29] D. Molina, J. Poyatos, J. Del Ser, S. García, A. Hussain, and F. Herrera, "Comprehensive Taxonomies of Nature- and Bio-inspired Optimization: Inspiration Versus Algorithmic Behavior, Critical Analysis Recommendations", Cognitive Computation, Vol. 12, No. 5, 2020, pp. 897-939.
Journal of Information Systems and Telecommunication
http://jist.acecr.org, ISSN: 2322-1437 / EISSN: 2345-2773

Statistical Analysis and Comparison of the Performance of Meta-Heuristic Methods Based on their Powerfulness and Effectiveness

Mehrdad Rohani1*, Hassan Farsi1, Seyed Hamid Zahiri1

1. Department of Electronic and Computer Engineering, University of Birjand, Birjand, Iran

Received: 04 May 2021 / Revised: 24 Jul 2021 / Accepted: 24 Aug 2021
DOI: https://doi.org/10.52547/jist.16067.10.37.49
Abstract
In this paper, the performance of meta-heuristic algorithms is compared using statistical analysis based on new criteria (powerfulness and effectiveness). Due to the large number of meta-heuristic methods reported so far, choosing one of them has always been challenging for researchers. In fact, the user does not know which of these methods is able to solve their complex problem. In this paper, new criteria are proposed in order to compare the performance of several methods drawn from different categories of meta-heuristic methods. Using these criteria, the user is able to choose an effective method for their problem.
For this reason, statistical analysis is conducted on each of these methods to clarify the application of each method for the user. Also, the powerfulness and effectiveness criteria are defined to compare the performance of the meta-heuristic methods and to provide a suitable basis and suitable quantitative parameters for this purpose. The results of these criteria clearly show the ability of each method for different applications and problems.
Keywords: Effectiveness; Meta-heuristic Algorithms; Optimization; Powerfulness; Statistical Analysis.
1- Introduction
Optimization is the process of finding the best acceptable answer under the limits and requirements of the problem. An optimization problem may have several candidate answers for the desired parameters; a function, called the goal (objective) function, is therefore defined to compare these answers and to choose the optimal one. An optimization method must be able to extract the optimal answer for this function. The advance of computers over the last five decades has led to the improvement of optimization methods, and each of these methods has a different ability to solve an optimization problem. However, because of the great number of optimization methods, the most important question that arises is which method is suitable and provides the best performance for solving a given problem. These methods can be classified into three broad categories: enumeration methods, calculus-based methods, and random methods, which are explained in the following.
Enumeration methods: In each iteration, only one point belonging to the answer space is examined. This category of methods is simpler than the others in terms of implementation but requires considerable computation. In these methods, no mechanism exists to decrease the scope of the search space, and the search space remains very large. For instance, dynamic programming is an example of these methods, and it acts completely unintelligently [1, 2].
Calculus-based methods: These methods use a set of necessary and sufficient conditions that the answer to the problem must satisfy. They usually use the gradient of the goal function to guide the search. Sometimes, due to discontinuity of the goal function, its gradient cannot be calculated; therefore, these methods also face challenges [3].
Random methods: One uncommon way to find the optimal answer for a problem is to consider all of the possible answers: the goal function is calculated for every possible answer and, at the end, the best answer is selected. Such a complete enumeration leads to the exact answer to the problem, but using it in practical problems is not possible due to the vast range of possible answers. Given this issue, the effort has always been to present methods that are able to decrease the search space. To address this, random search methods such as heuristic and meta-heuristic methods have been presented. This category of methods is able to present a proper answer, close to the optimal one, in a limited time, unlike enumeration and calculus-based methods. Random search methods are mainly based on enumeration, except that they use additional information to guide the search. These methods are completely general in terms of application area, and they are able to solve very complex problems.
The main problem of heuristic methods is that they get caught in local optimal points and converge to these points prematurely. Meta-heuristic methods are presented to solve this problem [4]. In fact, meta-heuristic methods belong to the category of methods that have a mechanism to escape local optimal points. These methods can be used in a broad range of problems, and optimization methods are applied in various fields [5, 6]. The user chooses one of the methods based on the application.
In this paper, we examine a set of these methods and present a statistical analysis. For this reason, we determine the stability of each method and its real-time performance. Also, two new criteria, powerfulness and effectiveness, are introduced, and the performance of each method is examined using these two new criteria.
In the next section, meta-heuristic methods from different categories are introduced. In the third section, the benchmarks as well as the conventional evaluation criteria (best fitness, average run time, and standard deviation) are introduced; the new criteria and the statistical analysis are also presented in this section. In the fourth section, the tests and results are reviewed.
2- Meta-Heuristic Methods
During the last three decades, the introduction of new meta-heuristic methods, as well as their application in different fields, has increased considerably. In 1983, Kirkpatrick proposed the simulated annealing method [7]. In 1992, Koza introduced the genetic programming method [8]. After that, Walker et al. introduced the first algorithm based on bee colonies in 1993 [9]. In 1994, the term meta-heuristic was used by Glover when introducing the tabu search method [10]. In 1995, Kennedy and Eberhart introduced particle swarm optimization [11]. In 2002, Passino introduced the bacterial foraging-based method [12]. In 2008, a biogeography-based optimization algorithm was introduced by Simon [13]. Many methods in this area have been presented in the last decade. In 2014, the grey wolf optimizer was introduced by Mirjalili et al. [14]. In 2015, the Ant Lion Optimizer (ALO) was presented by Mirjalili [15]. In the same year, the moth-flame optimization algorithm was presented by Mirjalili [16]. In 2016, Mirjalili introduced the dragonfly (DA), multi-verse (MVO), sine cosine (SCA), and whale optimization (WOA) algorithms for optimization [17-20]. In 2017, the salp swarm algorithm (SSA) was presented by Mirjalili et al. [21]. The development of these algorithms has proceeded at a high speed in recent years.
These methods are able to escape local optimal answers and move toward the global optimal answer in a short period of time. The important factor in these methods is the dynamic balance between the exploration and exploitation strategies. Exploration searches the answer space broadly, while exploitation concentrates the search on the more promising regions and prevents time from being wasted in parts of the search space where the possibility of finding the optimal answer is low. Meta-heuristic methods are divided into categories such as single-point-based versus population-based, nature-inspired versus not nature-inspired, with memory versus memoryless, and probabilistic versus deterministic. Some meta-heuristic methods are memoryless and do not use the information obtained during the search, while others take advantage of this information. Single-point-based methods modify one answer during the search process, while population-based methods consider a set of answers during the search [22]. Generally, population-based methods are slower compared to single-point methods, but they are able to produce more desirable answers; moreover, due to the advances in computing power, population-based methods have gained more importance. Among these methods, WOA, GWO, DA, and SSA can be pointed out. All of these methods belong to the category of nature-inspired, population-based methods with memory. A categorization of these methods is presented in Fig. 1 [23].
2-1- Whale Optimization Algorithm (WOA)
This method is inspired by social behavior, and its mechanism is based on the social movement of humpback whales and the way they hunt. Humpback whales are able to identify the location of the prey [20]. In the modeling of the algorithm, the initial locations of the search agents are considered to be close to the desired position, and after determining the best search agent, the other agents update their locations according to it. This behavior is modeled through Eq. (1) and Eq. (2). All the equations in this subsection are adopted from [20].
$$\vec{D} = \left| \vec{C}\cdot\vec{X}^{*}(t) - \vec{X}(t) \right| \qquad (1)$$
$$\vec{X}(t+1) = \vec{X}^{*}(t) - \vec{A}\cdot\vec{D} \qquad (2)$$
Fig. 1. Classification of meta-heuristic algorithms [23]
Where t is the current iteration, $\vec{A}$ and $\vec{C}$ are the coefficient vectors, $\vec{X}^{*}$ is the location vector of the best answer obtained so far, and $\vec{X}$ is the location vector of a search agent. Also, $\vec{X}^{*}$ must be updated in each iteration whenever a better answer is found. The coefficient vectors $\vec{A}$ and $\vec{C}$ are calculated with Eq. (3) and Eq. (4).
$$\vec{A} = 2\vec{a}\cdot\vec{r} - \vec{a} \qquad (3)$$
$$\vec{C} = 2\cdot\vec{r} \qquad (4)$$
Where $\vec{a}$ is linearly decreased from 2 to 0 over the course of the iterations and $\vec{r}$ is a random vector in [0, 1]. The spiral (bubble-net) update of the position, which models the exploitation phase, is expressed by Eq. (5) and Eq. (6), and the two update mechanisms are combined in Eq. (7):
$$\vec{D}' = \left| \vec{X}^{*}(t) - \vec{X}(t) \right| \qquad (5)$$
$$\vec{X}(t+1) = \vec{D}'\cdot e^{bl}\cdot\cos(2\pi l) + \vec{X}^{*}(t) \qquad (6)$$
$$\vec{X}(t+1) = \begin{cases} \vec{X}^{*}(t) - \vec{A}\cdot\vec{D}, & p < 0.5 \\ \vec{D}'\cdot e^{bl}\cdot\cos(2\pi l) + \vec{X}^{*}(t), & p \ge 0.5 \end{cases} \qquad (7)$$
Where b is a constant defining the shape of the logarithmic spiral, l is a random number in [-1, 1], and p is a random number in [0, 1].
Search for prey (exploration phase): In this phase, a randomly selected search agent, rather than the best one, is used to update the positions of the other agents. This lets the algorithm perform a global search in the search space and is modeled by Eq. (8) and Eq. (9).
$$\vec{D} = \left| \vec{C}\cdot\vec{X}_{rand} - \vec{X} \right| \qquad (8)$$
$$\vec{X}(t+1) = \vec{X}_{rand} - \vec{A}\cdot\vec{D} \qquad (9)$$
Where $\vec{X}_{rand}$ is the location vector of a randomly chosen search agent from the current population.
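To make the update rules above concrete, the following Python fragment is a minimal sketch of the WOA position update, not the authors' implementation; the population size, iteration budget, bounds, and the spiral constant b = 1 are illustrative assumptions.

```python
import numpy as np

def woa_minimize(objective, dim=10, n_agents=30, max_iter=200, lb=-100.0, ub=100.0):
    """Minimal WOA sketch following the update rules of Eqs. (1)-(9)."""
    X = np.random.uniform(lb, ub, (n_agents, dim))        # whale positions
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[fitness.argmin()].copy()                     # X*: best solution found so far
    best_fit = float(fitness.min())

    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter                      # a decreases linearly from 2 to 0
        for i in range(n_agents):
            A = 2.0 * a * np.random.rand() - a            # Eq. (3)
            C = 2.0 * np.random.rand()                    # Eq. (4)
            p = np.random.rand()
            l = np.random.uniform(-1.0, 1.0)
            if p < 0.5:
                if abs(A) < 1:                            # encircle the best agent, Eqs. (1)-(2)
                    D = np.abs(C * best - X[i])
                    X[i] = best - A * D
                else:                                     # search for prey, Eqs. (8)-(9)
                    X_rand = X[np.random.randint(n_agents)]
                    D = np.abs(C * X_rand - X[i])
                    X[i] = X_rand - A * D
            else:                                         # spiral (bubble-net) update, Eqs. (5)-(6), b = 1
                D_prime = np.abs(best - X[i])
                X[i] = D_prime * np.exp(l) * np.cos(2.0 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
        fitness = np.apply_along_axis(objective, 1, X)
        if fitness.min() < best_fit:                      # update X* when a better answer appears
            best_fit = float(fitness.min())
            best = X[fitness.argmin()].copy()
    return best, best_fit
```

For example, `woa_minimize(lambda x: float(np.sum(x ** 2)))` would run the sketch on a sphere-type objective.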
2-2- Grey Wolf Optimizer (GWO)
This method models the social leadership hierarchy (alpha, beta, and delta wolves) and the hunting mechanism of grey wolves [14]. Encircling the prey is modeled by Eq. (10) and Eq. (11), with the coefficient vectors given by Eq. (12) and Eq. (13). All the equations in this subsection are adopted from [14].
$$\vec{D} = \left| \vec{C}\cdot\vec{X}_{p}(t) - \vec{X}(t) \right| \qquad (10)$$
$$\vec{X}(t+1) = \vec{X}_{p}(t) - \vec{A}\cdot\vec{D} \qquad (11)$$
$$\vec{A} = 2\vec{a}\cdot\vec{r}_{1} - \vec{a} \qquad (12)$$
$$\vec{C} = 2\cdot\vec{r}_{2} \qquad (13)$$
Where $\vec{X}_{p}$ is the location vector of the prey, $\vec{X}$ is the location vector of a grey wolf, $\vec{a}$ decreases linearly from 2 to 0 over the iterations, and $\vec{r}_{1}$ and $\vec{r}_{2}$ are random vectors in [0, 1]. The position of each wolf is then updated with respect to the three best answers (alpha, beta, and delta) according to Eq. (14), Eq. (15), and Eq. (16).
$$\vec{D}_{\alpha} = \left| \vec{C}_{1}\cdot\vec{X}_{\alpha} - \vec{X} \right|, \quad \vec{D}_{\beta} = \left| \vec{C}_{2}\cdot\vec{X}_{\beta} - \vec{X} \right|, \quad \vec{D}_{\delta} = \left| \vec{C}_{3}\cdot\vec{X}_{\delta} - \vec{X} \right| \qquad (14)$$
$$\vec{X}_{1} = \vec{X}_{\alpha} - \vec{A}_{1}\cdot\vec{D}_{\alpha}, \quad \vec{X}_{2} = \vec{X}_{\beta} - \vec{A}_{2}\cdot\vec{D}_{\beta}, \quad \vec{X}_{3} = \vec{X}_{\delta} - \vec{A}_{3}\cdot\vec{D}_{\delta} \qquad (15)$$
$$\vec{X}(t+1) = \frac{\vec{X}_{1} + \vec{X}_{2} + \vec{X}_{3}}{3} \qquad (16)$$
Finally, after finishing the iterations, the location of the alpha wolf will be selected as the optimum point.
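The pull toward the three leading wolves can be sketched as follows; this is an illustrative reading of Eqs. (10)-(16) rather than the authors' code, and the population size, bounds, and iteration budget are assumptions.

```python
import numpy as np

def gwo_minimize(objective, dim=10, n_wolves=30, max_iter=200, lb=-100.0, ub=100.0):
    """Minimal GWO sketch: positions are pulled toward the alpha, beta, and delta wolves."""
    X = np.random.uniform(lb, ub, (n_wolves, dim))
    for t in range(max_iter):
        fitness = np.apply_along_axis(objective, 1, X)
        order = np.argsort(fitness)
        alpha, beta, delta = X[order[0]].copy(), X[order[1]].copy(), X[order[2]].copy()
        a = 2.0 - 2.0 * t / max_iter                  # a decreases linearly from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):       # one partial move per leader, Eqs. (14)-(15)
                A = 2.0 * a * np.random.rand(dim) - a
                C = 2.0 * np.random.rand(dim)
                D = np.abs(C * leader - X[i])
                new_pos += leader - A * D
            X[i] = np.clip(new_pos / 3.0, lb, ub)     # average of the three moves, Eq. (16)
    fitness = np.apply_along_axis(objective, 1, X)
    return X[fitness.argmin()], float(fitness.min())
```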
2-3- Dragonfly Optimization Algorithm (DA)
The dragonfly algorithm is a swarm-intelligence method inspired by nature. The main idea is based on the behavior of dragonflies while hunting for food and while migrating [17]. The swarming of dragonflies serves two purposes: hunting, which forms the static (feeding) swarm, and migration, which forms the dynamic swarm. These behaviors are modeled by Eq. (17). All the equations in this subsection are adopted from [17].
$$S_{i} = -\sum_{j=1}^{N}\left( X - X_{j} \right), \quad A_{i} = \frac{\sum_{j=1}^{N} V_{j}}{N}, \quad C_{i} = \frac{\sum_{j=1}^{N} X_{j}}{N} - X \qquad (17)$$
Where $S_{i}$ is the separation, $A_{i}$ is the alignment, and $C_{i}$ is the cohesion of the i-th individual; X is the current location, N is the number of neighbors, $X_{j}$ is the location of the j-th neighbor, and $V_{j}$ is the speed of the j-th neighbor. The attraction towards the food source (goal) and the distraction from the enemy are modeled by Eq. (18).
$$F_{i} = X^{+} - X, \quad E_{i} = X^{-} + X \qquad (18)$$
Where X is the location of the current search agent, $X^{+}$ is the location of the food source, and $X^{-}$ is the location of the enemy. The movement behavior of the dragonflies is modeled by Eq. (19).
$$\Delta X_{t+1} = \left( sS_{i} + aA_{i} + cC_{i} + fF_{i} + eE_{i} \right) + w\Delta X_{t} \qquad (19)$$
Here s, a, c, f, and e are the weight values that adjust the exploration and exploitation processes (for separation, alignment, cohesion, food attraction, and enemy distraction, respectively), w is the inertia weight, and t is the iteration counter. The position of each search agent is updated according to Eq. (20).
$$X_{t+1} = X_{t} + \Delta X_{t+1} \qquad (20)$$
To improve exploration in the search space and to model the static behavior of the dragonflies when there is no neighboring dragonfly, a random-walk (Lévy flight) process is added to the algorithm, and the location of the dragonflies is updated according to Eq. (21).
$$X_{t+1} = X_{t} + \mathrm{Levy}(d) \times X_{t} \qquad (21)$$
$$\mathrm{Levy}(x) = 0.01 \times \frac{r_{1}\sigma}{\left| r_{2} \right|^{1/\beta}} \qquad (22)$$
Where d is the dimension of the position vector, $r_{1}$ and $r_{2}$ are random numbers in [0, 1], $\beta$ is a constant, and $\sigma$ is computed from $\beta$.
Finally, the location of the dragonflies is updated based on the two static and dynamic behaviors and the best answer is selected as the optimal answer to the problem [17].
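A single iteration of the dragonfly update can be sketched as below. This is only an illustration of the step-vector idea of Eqs. (17)-(20): every dragonfly is treated as a neighbor of every other one, the Lévy-flight branch of Eqs. (21)-(22) is omitted, and the behavior weights are arbitrary example values rather than the adaptive weights of [17].

```python
import numpy as np

def da_step(X, V, food, enemy, w=0.9, s=0.1, a=0.1, c=0.7, f=1.0, e=1.0):
    """One dragonfly update (Eqs. (17)-(20)); all other dragonflies are treated as neighbors."""
    n, dim = X.shape
    X_new, V_new = X.copy(), V.copy()
    for i in range(n):
        others = np.delete(X, i, axis=0)
        others_v = np.delete(V, i, axis=0)
        S = -np.sum(X[i] - others, axis=0)            # separation
        A = others_v.mean(axis=0)                     # alignment
        C = others.mean(axis=0) - X[i]                # cohesion
        F = food - X[i]                               # attraction to the food source
        E = enemy + X[i]                              # distraction from the enemy
        V_new[i] = w * V[i] + s * S + a * A + c * C + f * F + e * E   # step vector, Eq. (19)
        X_new[i] = X[i] + V_new[i]                    # position update, Eq. (20)
    return X_new, V_new

# Illustrative use with random initial positions and zero step vectors:
# X = np.random.uniform(-10, 10, (30, 5)); V = np.zeros_like(X)
# food, enemy = X[0], X[1]   # in a full optimizer: the best and worst positions found so far
# X, V = da_step(X, V, food, enemy)
```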
2-4- Salp Swarm Optimization Algorithm (SSA)
This method is inspired by the social behavior of salps. These creatures move through deep waters in groups known as salp chains; some researchers believe this behavior helps them move better and reach food quickly [21]. To model the behavior of the salps, they are divided into two groups: the leader and the followers. The leader is the first member of the salp chain, and the others are called followers. The positions of the salps are stored in a two-dimensional matrix, and the food source, denoted F, is the target of the swarm. The location of the leader salp is modeled by Eq. (23). All the equations in this subsection are adopted from [21].
$$x_{j}^{1} = \begin{cases} F_{j} + c_{1}\left( (ub_{j} - lb_{j})c_{2} + lb_{j} \right), & c_{3} \ge 0.5 \\ F_{j} - c_{1}\left( (ub_{j} - lb_{j})c_{2} + lb_{j} \right), & c_{3} < 0.5 \end{cases} \qquad (23)$$
Where $x_{j}^{1}$ is the location of the first salp (the leader) in the j-th dimension, $F_{j}$ is the location of the food in the j-th dimension, $ub_{j}$ is the upper bound of the j-th dimension, and $lb_{j}$ is the lower bound of the j-th dimension. The parameters $c_{2}$ and $c_{3}$ are random numbers between 0 and 1, but the parameter $c_{1}$ has an important role, namely balancing exploration in the search space, and is modeled by Eq. (24).
$$c_{1} = 2e^{-\left( \frac{4l}{L} \right)^{2}} \qquad (24)$$
Where l is the current iteration and L is the maximum number of iterations. The location of the follower salps is expressed by Eq. (25).
$$x_{j}^{i} = \frac{1}{2}\left( x_{j}^{i} + x_{j}^{i-1} \right) \qquad (25)$$
Where $i \ge 2$ and $x_{j}^{i}$ is the location of the i-th follower salp in the j-th dimension. In this way, the SSA algorithm updates the locations of the leader and its followers in each iteration and introduces the best location found as the answer in the last iteration [21].
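The leader/follower mechanism of Eqs. (23)-(25) can be sketched as follows. This is not the authors' code: the 0.5 threshold on c3, the population size, bounds, and iteration budget are assumptions taken from common SSA implementations.

```python
import numpy as np

def ssa_minimize(objective, dim=10, n_salps=30, max_iter=200, lb=-100.0, ub=100.0):
    """Minimal SSA sketch: the leader follows the food, the followers follow their predecessor."""
    X = np.random.uniform(lb, ub, (n_salps, dim))
    fitness = np.apply_along_axis(objective, 1, X)
    food = X[fitness.argmin()].copy()                 # F: best solution found so far
    food_fit = float(fitness.min())

    for l in range(1, max_iter + 1):
        c1 = 2.0 * np.exp(-((4.0 * l / max_iter) ** 2))   # Eq. (24)
        for j in range(dim):                          # leader update, Eq. (23)
            c2, c3 = np.random.rand(), np.random.rand()
            step = c1 * ((ub - lb) * c2 + lb)
            X[0, j] = food[j] + step if c3 >= 0.5 else food[j] - step
        for i in range(1, n_salps):                   # follower update, Eq. (25)
            X[i] = 0.5 * (X[i] + X[i - 1])
        X = np.clip(X, lb, ub)
        fitness = np.apply_along_axis(objective, 1, X)
        if fitness.min() < food_fit:
            food_fit = float(fitness.min())
            food = X[fitness.argmin()].copy()
    return food, food_fit
```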
3- Stability Analysis of Meta-Heuristic Methods Based on Statistical Methods
So far, different methods have been reported for evaluating the stability of meta-heuristic methods based on statistical analysis. In this section, we introduce the proposed approach after referring to the methods that have been reported in previous research.
3-1- Conventional Stability Analysis Methods
Stability analysis of meta-heuristic methods is performed using two approaches: mathematical analysis and statistical analysis. Dorigo, for the ant colony optimization (ACO) method, proved by bounding the pheromone that the ant method converges to the optimal answer and that the method is stable [24]. Clerc performed a statistical analysis for the particle swarm method and guaranteed convergence under certain conditions on the parameters of the problem [25]. In this research, Monte Carlo evaluation is used to analyze the stability of the meta-heuristic methods. This analysis includes the examination of convergence on different optimization problems; the ability of each method in encountering each of these problems is determined with high confidence through many independent runs. The run-time parameter is also calculated for each algorithm so that a user is able to choose a method based on online or offline applications. Also, two new metrics, namely powerfulness and effectiveness, are introduced to examine the convergence of a method and its power in obtaining the global answer.
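One way to organize such a Monte Carlo evaluation is to repeat each optimizer many times on each benchmark and record the best fitness, mean, standard deviation, average run time, and the fraction of runs that reach the known optimum within a tolerance. The sketch below is illustrative only: the optimizer interface (a callable returning a solution and its best fitness, as in the sketches above), the number of runs, and the success tolerance are assumptions, and the `success_rate` value is not the paper's exact powerfulness or effectiveness formula.

```python
import time
import numpy as np

def monte_carlo_profile(optimizer, objective, f_min, runs=30, tol=1e-4):
    """Repeat an optimizer on one benchmark and collect simple stability statistics."""
    best_values, run_times = [], []
    for _ in range(runs):
        start = time.perf_counter()
        _, best = optimizer(objective)                # optimizer returns (solution, best_fitness)
        run_times.append(time.perf_counter() - start)
        best_values.append(best)
    best_values = np.array(best_values)
    return {
        "best": float(best_values.min()),
        "mean": float(best_values.mean()),
        "std": float(best_values.std()),              # low std suggests stable, repeatable behavior
        "avg_time": float(np.mean(run_times)),
        "success_rate": float(np.mean(np.abs(best_values - f_min) < tol)),  # assumed tolerance
    }
```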
3-1-1- Benchmark Functions
Researchers face various problems with different complexities and use different meta-heuristic algorithms to solve and optimize them. We have used 23 benchmark functions with different complexities, including unimodal, multimodal, and fixed-dimension multimodal functions, and the performance of each algorithm has been tested by solving these 23 problems [26-28]. These functions are presented in Tables 1-3, where fmin indicates the value at the global optimum point of the search space. Table 1 lists the F1-F7 functions, which have a single global optimum point and test the exploitation capability of a method in the search space. The benchmark functions F8-F13 are multimodal functions and are shown in Table 2; this category has several local optimum points, and the number of local optima increases exponentially with the dimension, so these functions are well suited to testing the exploration capability of a method. The third category is the fixed-dimension multimodal functions, shown in Table 3. These functions also have one global optimum point and several local optimum points and, similar to the second category, they analyze the effectiveness and powerfulness of a method. Figure 2 shows the 3-D representation of the benchmark functions' search spaces.
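For concreteness, two benchmarks of this kind can be written in a few lines. The sketch below assumes the usual sphere and Rastrigin formulations from [26-28]; labeling them F1 and F9 follows the conventional 23-function suite and is an assumption, since the formulas themselves are not reproduced in the tables below.

```python
import numpy as np

def sphere(x):
    """Unimodal sphere function (commonly F1): single global minimum f = 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def rastrigin(x):
    """Multimodal Rastrigin function (commonly F9): many local minima, global minimum f = 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

# Quick check at the known optimum of both functions.
if __name__ == "__main__":
    z = np.zeros(30)
    print(sphere(z), rastrigin(z))   # both should print 0.0
```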
Table 1: Description of unimodal benchmark functions
| Function | Range | fmin |
| F1 | [-100, 100] | 0 |
| F2 | [-10, 10] | 0 |
| F3 | [-100, 100] | 0 |
| F4 | [-100, 100] | 0 |
| F5 | [-30, 30] | 0 |
| F6 | [-100, 100] | 0 |
| F7 | [-1.28, 1.28] | 0 |
Fig. 2. Search space of composite benchmark functions.
Table 2: Description of multimodal benchmark functions
| Function | Range | fmin |
| F8 | [-500, 500] |  |
| F9 | [-5.12, 5.12] | 0 |
| F10 | [-32, 32] | 0 |
| F11 | [-600, 600] | 0 |
| F12 | [-50, 50] | 0 |
| F13 | [-50, 50] | 0 |
Table 3: Description of fixed-dimension multimodal benchmark functions
| Function | Range | fmin |
| F14 | [-65, 65] | 1 |
| F15 | [-5, 5] | 0.00030 |
| F16 | [-5, 5] | -1.0316 |
| F17 | [-5, 5] | 0.398 |
| F18 | [-2, 2] | 3 |
| F19 | [0, 1] | -3.86 |
| F20 | [0, 1] | -3.32 |
| F21 | [0, 10] | -10.1532 |
| F22 | [0, 10] | -10.4028 |
| F23 | [0, 10] | -10.5363 |
Fig. 3. Convergence curves of the WOA, GWO, DA, and SSA algorithms obtained on F4, F9, and F15.
Table 4: WOA algorithm
Table 5: GWO algorithm
Table 6: DA algorithm
Table 7: SSA algorithm
Table 8: Average PF criterion for each function type
| Algorithm | Unimodal | Multimodal | Fixed-dimension multimodal |
| WOA | 59.85 | 83.33 | 80.70 |
| GWO | 85.71 | 83.33 | 99.80 |
| DA | 100 | 83.33 | 100 |
| SSA | 85.71 | 100 | 100 |
Besides the powerfulness metric, the convergence rate of each method on each benchmark function can be examined through the convergence graphs drawn in red in Figure 3. This test is performed visually: by observing the red convergence graphs, it can be seen how cohesively an algorithm behaves on a benchmark function. If the curves are drawn coherently and close to each other, the convergence behavior is consistent, as for the GWO and SSA methods on the F4 function. On the contrary, if the drawn curves are far apart from each other, the algorithm behaves differently in each run on that benchmark function; this means that the algorithm produces different answers in each run and its reliability on that benchmark function decreases, as for the WOA algorithm on the F4 function.
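This visual check can be reproduced by overlaying the best-fitness-per-iteration curves of repeated runs: tightly bundled curves indicate repeatable behavior, widely spread curves indicate run-to-run variability. The sketch below assumes the optimizer has been modified to record its best fitness at every iteration (the `run_histories` list-of-lists interface is an assumption for illustration).

```python
import matplotlib.pyplot as plt

def plot_convergence_spread(run_histories, title="Convergence over repeated runs"):
    """Overlay best-fitness-per-iteration curves from several independent runs, as in Fig. 3."""
    for history in run_histories:                 # one red curve per run
        plt.plot(history, color="red", alpha=0.4, linewidth=1)
    plt.xlabel("Iteration")
    plt.ylabel("Best fitness so far")
    plt.title(title)
    plt.show()
```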
Average EF criterion
In this section, the average EF values are shown in Table 9. The first column indicates the average values for the unimodal functions, where the SSA method has the highest value with 71%; this means that the SSA method has the greatest ability to obtain the global optimum answer on this category. In this column, the DA method has the weakest performance in obtaining the global answer, with only 0.28%. The second column indicates the performance of the methods on the multimodal functions, where the WOA method has the best performance with 49.83% and the DA method is the weakest with 0.50%. The third column is calculated for the fixed-dimension multimodal functions, where the SSA method gives the best performance with 30.6% and the DA method provides the lowest value with 8.80%.
Table 9: Average EF criterion for each function type
| Algorithm | Unimodal | Multimodal | Fixed-dimension multimodal |
| WOA | 28.57 | 49.83 | 22.60 |
| GWO | 58.00 | 46.66 | 28.60 |
| DA | 0.28 | 0.50 | 8.80 |
| SSA | 71 | 35.33 | 30.6 |
Average run time
One of the important matters in engineering is to obtain the desired answer in the shortest time. Thus, the average run time of each algorithm is reported in Table 10. The average time is calculated separately for each type of benchmark function to test the speed of each method. In the first column, the average run time on the unimodal functions (F1-F7) is given: the fastest algorithm is WOA with 0.1091 seconds, while DA is the slowest with 9.6333 seconds. On the multimodal functions (F8-F13), the WOA algorithm is again the fastest with 0.1267 seconds, and the DA algorithm has the weakest performance with 12.6949 seconds. The third column presents the fixed-dimension multimodal functions (F14-F23); here the WOA algorithm is the fastest with 0.0556 seconds, while the DA method is the slowest in terms of run time with 6.3046 seconds. The obtained results show that the DA method is the slowest algorithm in comparison with the other methods.
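A per-category average run time of this kind can be measured with a simple wall-clock loop such as the sketch below; the optimizer and benchmark callables (for example, the woa_minimize and sphere sketches given earlier) and the number of repetitions are assumptions, and the measured values will of course depend on the hardware used.

```python
import time
import numpy as np

def average_runtime(optimizer, functions, runs=10):
    """Average wall-clock run time of one optimizer over a group of benchmark functions."""
    times = []
    for objective in functions:
        for _ in range(runs):
            start = time.perf_counter()
            optimizer(objective)                  # result discarded; only the timing matters here
            times.append(time.perf_counter() - start)
    return float(np.mean(times))

# Illustrative use for the unimodal group, assuming the earlier sketches are in scope:
# avg_unimodal = average_runtime(woa_minimize, [sphere], runs=10)
```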
Table 10: Average run time (sec)
* Corresponding Author: Hassan Farsi, Hfarsi@Birjand.ac.ir