
Meta-heuristics, Machine Learning, and Deep Learning Methods

  • Hitoshi Iba
Chapter

Abstract

This chapter introduces several meta-heuristics and machine learning methods that are used in later chapters to extend evolutionary computation frameworks. Readers already familiar with these methods may skip this chapter.

Keywords

Particle swarm optimization (PSO) · Differential evolution (DE) · k-means algorithm · Support vector machine (SVM) · Relevance vector machine (RVM) · k-nearest neighbor classifier (k-NN) · Transfer learning · Bagging · Boosting · Gröbner bases · Affinity propagation · Convolutional neural networks (CNN) · Generative adversarial networks (GAN) · Bayesian networks · Loopy belief propagation


Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  1. The University of Tokyo, Tokyo, Japan
