The \bold{Griewank} test function is multimodal and non-separable, and has several local optima within the search region defined by [-600, 600]. It is similar to the Rastrigin function, but the number of local optima is larger in this case. It has only one global optimum, located at the point \kbd{o=(0,...,0)}. The interpretation of the function changes with scale: a general overview suggests a convex function, a medium-scale view suggests the existence of local extrema, and a zoom on the details reveals a complex structure of numerous local minima. While this function has an exponentially increasing number of local minima as its dimension increases, it turns out that a simple multistart algorithm is able to detect its global minimum more and more easily as the dimension increases (Locatelli, 2003). It is defined by:
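A minimal R sketch of this standard (unshifted) definition; \code{griewank.sketch} is an illustrative name only, not necessarily the function exported by this package:
\preformatted{
# Griewank: sum-of-squares term minus a product of cosines, plus 1
griewank.sketch <- function(x) sum(x^2)/4000 - prod(cos(x/sqrt(seq_along(x)))) + 1

griewank.sketch(rep(0, 10))   # 0, the global minimum at o=(0,...,0)
}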
The main difficulty of the \bold{Schaffer's F6} test function is that the size of the potential maxima that need to be overcome to reach a minimum increases the closer one gets to the global minimum. It is defined by:
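A minimal R sketch of the two-dimensional Schaffer's F6 definition (\code{schafferF6.sketch} is an illustrative name only):
\preformatted{
# Schaffer's F6: the squared-sine term creates concentric ridges around the origin
schafferF6.sketch <- function(x) 0.5 + (sin(sqrt(sum(x^2)))^2 - 0.5) / (1 + 0.001*sum(x^2))^2

schafferF6.sketch(c(0, 0))   # 0, the global minimum
}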
The \emph{first function of De Jong's}, or \bold{Sphere} function, is one of the simplest test functions available in the specialized literature. This continuous, convex, unimodal and additively separable test function can be scaled up to any number of variables. It belongs to the family of quadratic functions and has only one optimum, located at the point \kbd{o=(0,...,0)}. The search range commonly used for the Sphere function is [-100, 100] for each decision variable. It is defined by:
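A one-line R sketch of this definition (\code{sphere.sketch} is an illustrative name, not the package's exported function):
\preformatted{
# Sphere (De Jong's first function): sum of squared decision variables
sphere.sketch <- function(x) sum(x^2)

sphere.sketch(rep(0, 30))   # 0, the global minimum at o=(0,...,0)
}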
\bold{Schwefel's} function is non-convex, multimodal, and additively separable. It is deceptive in that the global minimum is geometrically distant, over the parameter space, from the next-best local minima, so search algorithms are potentially prone to converging in the wrong direction. In addition, it is less symmetric than the Rastrigin function and has its global minimum at the edge of the search space [-500, 500], at the position \kbd{o=(420.9687,...,420.9687)}. Additionally, there is no overall guiding slope towards the global minimum, as in Ackley's function or, less extremely, in Rastrigin's function. It is defined by:
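A minimal R sketch of the commonly used form of Schwefel's function (\code{schwefel.sketch} is an illustrative name only):
\preformatted{
# Schwefel: the 418.9829*n offset makes the global minimum approximately 0
schwefel.sketch <- function(x) 418.9829*length(x) - sum(x*sin(sqrt(abs(x))))

schwefel.sketch(rep(420.9687, 10))   # approximately 0, near the edge of [-500, 500]
}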
Some optimisation algorithms take advantage of known properties of the benchmark functions, such as local optima lying along the coordinate axes, the global optimum having the same value for many of the variables, and so on. In order to avoid these shortcomings, a shifting vector and a single bias are introduced for some of the benchmark functions reported afterwards.
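As an illustration, a hedged sketch of how a shifted and biased variant of, for example, the Sphere function could be constructed; the name, shift and bias values below are assumptions, not the package's actual parameterisation:
\preformatted{
# Shift the optimum away from the origin by 'o' and translate the optimal value by 'bias'
# (o=10 and bias=-450 are illustrative values only)
shifted.sphere.sketch <- function(x, o=rep(10, length(x)), bias=-450) sum((x - o)^2) + bias

shifted.sphere.sketch(rep(10, 5))   # -450: the optimum is no longer at the origin nor equal to 0
}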
@@ -97,33 +116,38 @@ Each test function returns a single numeric value corresponding to the function
...
}
}
\references{
GEATbx: Example Functions (single and multi-objective functions) \cr
\cite{Dieterich, J.M. and B. Hartke. 2012. Empirical review of standard benchmark functions using evolutionary global optimization. Appl. Math. 3, 1552-1564, doi:10.4236/am.2012.330215}\cr
\cite{Barrera, J., and C. Coello Coello. 2010. Test function generators for assessing the performance of PSO algorithms in multimodal optimization, in Handbook of Swarm Intelligence, vol. 8, edited by B. Panigrahi, Y. Shi, and M.-H. Lim, chap. Adaptation, Learning, and Optimization, pp. 89-117, Springer Berlin Heidelberg, doi:10.1007/978-3-642-17390-5_4}\cr
Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization: \cite{\url{http://www.lri.fr/~hansen/Tech-Report-May-30-05.pdf}}\cr
Test functions for optimization needs: \cite{\url{http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf}}\cr
Test Functions for Unconstrained Global Optimization \cite{\url{http://www-optima.amp.i.kyoto-u.ac.jp/member/student/hedar/Hedar_files/TestGO_files/Page364.htm}}\cr
Griewank: \cite{Locatelli, M. 2003. A note on the Griewank test function, Journal of Global Optimization, 25 (2), 169-174, doi:10.1023/A:1021956306041}\cr
Schaffer's F6: \cite{Xiaohong Qiu, Jun Liu. 2009. A Novel Adaptive PSO Algorithm on Schaffer's F6 Function. 2009 Ninth International Conference on Hybrid Intelligent Systems, pp. 94-98}\cr