Abstract
Comparing the performance of several algorithms is an essential task. It is already difficult for stationary problems, where the researcher typically tests many algorithms, each with several parameter settings, on different problem instances. The situation is even more complex for dynamic optimization problems, where additional dynamism-specific configurations must also be analyzed (e.g., the severity, frequency, and type of the changes). In this work, we present a technique to compact those results visually, improving their interpretability and providing an easy way to detect behavioral patterns across algorithms. However, as with every form of compression, it entails the loss of some information. The pros and cons of this technique are explained, with special emphasis on statistical issues that commonly arise when dealing with algorithms of a stochastic nature.
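The core idea behind significance-based ranking of algorithms can be sketched as follows: for each problem configuration, every pair of algorithms is compared, and an algorithm's rank is the number of rivals it significantly outperforms minus the number that significantly outperform it. This is a minimal illustrative sketch, not the chapter's implementation; the `significant` predicate below is a placeholder for a proper nonparametric test (e.g., Mann-Whitney with a Holm correction), and all names and the "lower error is better" convention are assumptions.

```python
from itertools import combinations
from statistics import mean

def rank_algorithms(results, significant):
    """Rank algorithms via pairwise comparisons.

    results: dict mapping algorithm name -> list of error values
             (one value per independent run; lower is better).
    significant: callable(sample_a, sample_b) -> bool, deciding whether
                 the two samples differ significantly (placeholder for a
                 real statistical test).
    Returns: dict mapping algorithm name -> integer rank, where rank =
             (# rivals significantly beaten) - (# rivals that beat it).
    """
    ranks = {name: 0 for name in results}
    for a, b in combinations(results, 2):
        if significant(results[a], results[b]):
            # The significantly better algorithm gains a point,
            # the worse one loses a point.
            if mean(results[a]) < mean(results[b]):
                ranks[a] += 1
                ranks[b] -= 1
            else:
                ranks[b] += 1
                ranks[a] -= 1
    return ranks

# Toy usage with a crude placeholder test (mean difference threshold):
runs = {
    "A": [0.10, 0.20, 0.10],
    "B": [0.50, 0.60, 0.40],
    "C": [0.50, 0.55, 0.45],
}
crude_test = lambda x, y: abs(mean(x) - mean(y)) > 0.2
print(rank_algorithms(runs, crude_test))
```

Repeating this ranking for every combination of dynamism settings (severity, change frequency, etc.) yields a grid of ranks that can then be color-coded, which is the kind of compact visual summary the chapter develops.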
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
del Amo, I.G., Pelta, D.A. (2013). SRCS: A Technique for Comparing Multiple Algorithms under Several Factors in Dynamic Optimization Problems. In: Alba, E., Nakib, A., Siarry, P. (eds) Metaheuristics for Dynamic Optimization. Studies in Computational Intelligence, vol 433. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30665-5_4
DOI: https://doi.org/10.1007/978-3-642-30665-5_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-30664-8
Online ISBN: 978-3-642-30665-5
eBook Packages: Engineering (R0)