Abstract
The Self Organizing Migrating Genetic Algorithm (SOMGA) is a hybrid variant of the Genetic Algorithm (GA) inspired by features of the Self Organizing Migrating Algorithm (SOMA), presented by Deep and Singh (IEEE Congr Evol Comput, pp 2796–2803, 2007) [1]. SOMGA combines the features of the binary-coded GA and the real-coded SOMA in such a way that the diversity of the solution space is maintained and thoroughly exploited while keeping the number of function evaluations low. It works with a very small population and aims to reach the global optimal solution in fewer function evaluations. Earlier, SOMGA had been used to solve problems of up to 10 dimensions with a population size of only 10. This chapter is divided into three sections. In the first section, the possibility of using SOMGA to solve large scale problems (up to 200 dimensions) is analyzed with the help of 13 test problems. The motivation for this extension is that SOMGA works with a very small population: to solve large scale problems (dimension 200), a population size of only 20 is required. On the basis of the results it is concluded that SOMGA solves large scale global optimization problems efficiently with a small population size, and hence requires fewer function evaluations. In the second section, two real life problems from the field of engineering are solved using SOMGA as an application. In the third section, two approaches to hybridizing a population based technique are compared: incorporating a deterministic local search into it, or merging it with another population based technique. To see the effect of both approaches on GA, the results of SOMGA on five test problems are compared with those of a memetic algorithm (MA, i.e. GA + deterministic local search). The results clearly indicate that SOMGA is less expensive and more effective in solving these problems.
References
Deep, K., Singh, D.: A new hybrid self organizing migrating genetic algorithm for function optimization. In: IEEE Congress on Evolutionary Computation, pp. 2796–2803 (2007)
Grefenstette, J.: Lamarckian learning in multi-agent environments. In: Proceedings of the Fourth International Conference on Genetic Algorithms, San Mateo, CA. Morgan Kaufmann (1994)
Kasprzyk, G.P., Jaskuła, M.: Application of hybrid genetic algorithms for deconvolution of electrochemical responses in SSLSV method. J. Electroanal. Chem. 567, 39–66 (2004)
Chelouah, R., Siarry, P.: A hybrid method combining continuous Tabu search and Nelder–Mead simplex algorithm for global optimization of multiminima functions. Eur. J. Oper. Res. 161, 636–654 (2005)
Wang, L., Tang, F., Wu, H.: Hybrid genetic algorithm based on quantum computing for numerical optimization and parameter estimation. Appl. Math. Comput. 171, 1141–1156 (2005)
Javadi, A., Farmani, A.R., Tan, T.P.: A hybrid intelligent genetic algorithm. Adv. Eng. Inf. 19, 255–262 (2005)
Fan, S.K.S., Liang, Y.C., Zahara, E.: A genetic algorithm and a particle swarm optimizer hybridized with Nelder–Mead simplex search. Comput. Ind. Eng. 50, 401–425 (2006)
Hwang, F.S., Song, H.R.: A hybrid real parameter genetic algorithm for function optimization. Adv. Eng. Inf. 20, 7–21 (2006)
Zhang, G., Lu, H.: Hybrid real coded genetic algorithm with quasi-simplex technique. Int. J. Comput. Sci. Netw. Secur. 6(10), 246–255 (2006)
Wei, L., Zhao, M.: A niche hybrid genetic algorithm for global optimization of continuous multimodal functions. Appl. Math. Comput. 160, 649–661 (2005)
Premalatha, K., Natarajan, A.M.: Hybrid PSO and GA for global optimization. Int. J. Open Probl. Comput. Math. 2 (2009)
Khosravi, A., Lari, A., Addeh, J.: A new hybrid of evolutionary and conventional optimization algorithm. Appl. Math. Sci. 6, 815–825 (2012)
Ghatei, S., et al.: A new hybrid algorithm for optimization using PSO and GDA. J. Basic Appl. Sci. Res. 2, 2336–2341 (2012)
Esmin, A., Matwin, S.: A hybrid particle swarm optimization algorithm with genetic mutation. Int. J. Innovative Comput. Inf. Control 9, 1919–1934 (2013)
Zelinka, I., Lampinen, J.: SOMA—self organizing migrating algorithm. In: Mendel, 6th International Conference on Soft Computing, Brno, Czech Republic, vol. 80, issue 2, p. 214 (2000)
Oplatkova, Z., Zelinka, I.: Investigation on Shannon–Kotelnik theorem impact on SOMA algorithm performance. In: Merkuryev, Y., Zobel, R. (eds.) Proceedings of the 19th European Conference on Modelling and Simulation (2005)
Zelinka, I.: Analytic programming by means of SOMA algorithm. In: Proceedings of the 8th International Conference on Soft Computing Mendel ’02, Brno, Czech Republic, pp. 93–101 (2002). ISBN 80-214-2135-5
Nolle, L., Zelinka, I.: SOMA applied to optimum work roll profile selection in the hot rolling of wide steel. In: Proceedings of the 17th European Simulation Multiconference ESM 2003, Nottingham, UK, pp. 53–58, 9–11 June 2003. ISBN 3-936150-25-7
Nolle, L., Zelinka, I., Hopgood, A.A., Goodyear, A.: Comparison of a self-organizing migration algorithm with simulated annealing and differential evolution for automated waveform tuning. Adv. Eng. Softw. 36, 645–653 (2005)
Nolle, L.: SASS applied to optimum work roll profile selection in the hot rolling of wide steel. Knowl. Based Syst. 20(2), 203–208 (2007)
Zelinka, I., Lampinen, J., Nolle, L.: On the theoretical proof of convergence for a class of SOMA search algorithms. In: Proceedings of the 7th International MENDEL Conference on Soft Computing, Brno, CZ, pp. 103–110, 6–8 June 2001. ISBN 80-214-1894-X
Zelinka, I., Oplatkova, Z., Nolle, L.: Boolean symmetry function synthesis by means of arbitrary evolutionary algorithms—comparative study. In: Proceedings of the 18th European Simulation Multiconference ESM 2004, Magdeburg, Germany, pp. 143–148, 13–14 June 2004. ISBN 3-936150-35-4
Onwubolu, C.G., Babu, B.V.: New Optimization Techniques in Engineering. Springer, Heidelberg (2004). ISBN 3-540-20167-X
Prasad, B.N., Saini, J.S.: Optimal thermohydraulic performance of artificially roughened solar air heaters. Sol. Energy 47, 91–96 (1991)
Pant, M.: Genetic Algorithms for Global Optimization and their Applications. Ph.D. thesis, Department of Mathematics, IIT Roorkee, Formerly University of Roorkee (2003)
Tsutsui, S., Fujimoto, Y.: Phenotypic forking genetic algorithm (p-fGA). In: IEEE International Conference on Evolutionary Computation (ICEC ’95), vol. 2, pp. 556–572 (1995)
Bazaraa, M.S., Sherali, H.D., Shetty, C.M.: Nonlinear Programming Theory and Algorithms. Wiley, New York (1993)
Ali, M.M., Khompatraporn, C., Zabinsky, Z.B.: A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems. J. Global Optim. 31, 635–672 (2005)
Appendix
This Appendix lists the 13 benchmark test problems taken from the literature that are used to evaluate the performance of the algorithm. They are unconstrained nonlinear optimization problems having a number of local as well as global optimal solutions. The problems vary in difficulty level and include both unimodal and multimodal functions.
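The equations for these problems did not survive extraction from the original page. As an illustration, the following sketch implements three of the benchmarks listed below (Sphere, Rastrigin, Ackley) using their standard textbook definitions; the chapter's exact parameter settings and search ranges are not reproduced here, so these forms are assumptions.

```python
import math

def sphere(x):
    """Sphere function: continuous, convex, unimodal; f(0, ..., 0) = 0."""
    return sum(xi * xi for xi in x)

def rastrigin(x):
    """Rastrigin function: highly multimodal; f(0, ..., 0) = 0.
    Standard form with modulator amplitude 10."""
    n = len(x)
    return 10.0 * n + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) for xi in x)

def ackley(x):
    """Ackley function: many local minima from its exponential terms;
    f(0, ..., 0) = 0."""
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e

if __name__ == "__main__":
    origin = [0.0] * 10
    print(sphere(origin), rastrigin(origin), ackley(origin))
```

All three attain their global minimum of 0 at the origin, which is a convenient sanity check for any optimizer implementation.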
Problem 1: (Cosine Mixture Problem)
This problem is the Cosine Mixture function. The global optimum of this function is at (0, 0,…, 0) with fmin = −0.1n, where n is the dimension of the problem. The functional form is as follows:
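The equation did not survive extraction; the standard definition from the literature, assumed here (the chapter's exact parameterization may differ), is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} x_i^2 - 0.1 \sum_{i=1}^{n} \cos(5\pi x_i)
```

so that f(0, 0,…, 0) = −0.1n.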
Problem 2: (Exponential Problem)
This problem is the Exponential function. The global optimum of this function is at (0, 0,…, 0) with fmin = −1. The functional form is as follows:
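The equation did not survive extraction; the standard definition, assumed here, is:

```latex
f(\mathbf{x}) = -\exp\left(-0.5 \sum_{i=1}^{n} x_i^2\right)
```

which attains f(0, 0,…, 0) = −1.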
Problem 3: (Ackley Function)
This problem is the Ackley function. Its surface has numerous local minima due to its exponential terms. Any search algorithm based on gradient information will be trapped in a local optimum, but a search strategy that analyzes a wider region is able to cross the valleys among the optima and achieve better results. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
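The equation did not survive extraction; the standard form, assumed here, is:

```latex
f(\mathbf{x}) = -20 \exp\left(-0.2 \sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n} \cos(2\pi x_i)\right) + 20 + e
```

so that f(0, 0,…, 0) = 0.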
Problem 4: (Sphere Function Problem)
The next problem is the Sphere function. It is continuous, convex, and unimodal. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
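The equation did not survive extraction; the standard definition, assumed here, is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} x_i^2
```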
Problem 5: (Griewank Function)
This problem is the Griewank function, a widely employed test function for global optimization. Although the number of its local minima grows exponentially with the dimension, it turns out that a simple Multistart algorithm is able to detect its global minimum more and more easily as the dimension increases. The optima of this function are regularly distributed. The number of local minima for arbitrary n is unknown, but in the two-dimensional case there are some 500 local minima. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
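The equation did not survive extraction; the standard definition, assumed here, is:

```latex
f(\mathbf{x}) = 1 + \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left(\frac{x_i}{\sqrt{i}}\right)
```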
Problem 6: (Axis Parallel Hyper Ellipsoid)
This problem is the Axis Parallel Hyper Ellipsoid function, also known as the weighted sphere model. It is similar to the Sphere function and is continuous, convex, and unimodal. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
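The equation did not survive extraction; the standard definition, assumed here, is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} i\, x_i^2
```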
Problem 7: (Schwefel’s Double Sum)
This problem is Schwefel’s Double Sum function, an extension of the axis parallel hyper ellipsoid function that produces a rotated hyper-ellipsoid. It is continuous, convex, and unimodal. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
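The equation did not survive extraction; the standard definition (also known as Schwefel's Problem 1.2), assumed here, is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} \left(\sum_{j=1}^{i} x_j\right)^2
```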
Problem 8: (Rastrigin Function)
This problem is the Rastrigin function, an extension of the Sphere function with a modulator term α · cos(2πxi). It has a large number of local minima (the exact number is not known) whose values increase with the distance to the global minimum. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
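The equation did not survive extraction; the standard form with modulator amplitude α = 10 (an assumption, since the chapter's value of α is not shown) is:

```latex
f(\mathbf{x}) = 10n + \sum_{i=1}^{n} \left(x_i^2 - 10 \cos(2\pi x_i)\right)
```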
Problem 9: (Rosenbrock Function)
This problem is the Rosenbrock function, also known as the banana function. It is a continuous, differentiable, unimodal, and non-separable function. Its difficulty arises from the nonlinear interaction between parameters: the global optimum lies inside a long, narrow, parabolic-shaped flat valley. Its global minimum is at (1, 1,…, 1) with fmin = 0. The functional form is as follows:
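The equation did not survive extraction; the standard definition, assumed here, is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n-1} \left[100\left(x_{i+1} - x_i^2\right)^2 + (x_i - 1)^2\right]
```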
Problem 10: (Schwefel Function)
This problem is the Schwefel function. Its contour is made up of a great number of peaks and valleys, and the function has a second-best minimum far from the global minimum, so it is difficult for many algorithms to locate the global optimum. Its global minimum is at (1, 1,…, 1) with fmin = 0. The functional form is as follows:
Problem 11: (Zakharov’s Problem)
This problem is the Zakharov function. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
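The equation did not survive extraction; the standard definition, assumed here, is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} x_i^2 + \left(\sum_{i=1}^{n} 0.5\, i\, x_i\right)^2 + \left(\sum_{i=1}^{n} 0.5\, i\, x_i\right)^4
```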
Problem 12: (Ellipsoidal Function)
This problem is the Ellipsoidal function. Its global minimum is at (1, 2,…, n) with fmin = 0. The functional form is as follows:
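The equation did not survive extraction; a common definition consistent with the stated minimizer (1, 2,…, n), assumed here, is:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} (x_i - i)^2
```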
Problem 13: (Schwefel Problem 4)
This problem is the Schwefel Problem 4 function. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:
Copyright information
© 2016 Springer International Publishing Switzerland
About this chapter
Cite this chapter
Singh, D., Deep, K. (2016). SOMGA for Large Scale Function Optimization and Its Application. In: Davendra, D., Zelinka, I. (eds) Self-Organizing Migrating Algorithm. Studies in Computational Intelligence, vol 626. Springer, Cham. https://doi.org/10.1007/978-3-319-28161-2_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-28159-9
Online ISBN: 978-3-319-28161-2