
Multi-objectivization and Surrogate Modelling for Neural Network Hyper-parameters Tuning

  • Martin Pilát
  • Roman Neruda
Part of the Communications in Computer and Information Science book series (CCIS, volume 375)

Abstract

We present a multi-objectivization approach to the parameter tuning of RBF networks and multilayer perceptrons. The approach works by adding two new objectives – maximization of the kappa statistic and minimization of the root mean square error – to the originally single-objective problem of minimizing the classification error of the model. We evaluate the performance of the multi-objectivization approach on five data sets and compare it to two surrogate-based approaches: a single-objective one and a multi-objective one.
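To make the three objectives concrete, the sketch below (our illustration, not the authors' code) evaluates a single hyper-parameter setting of a multilayer perceptron and returns the original objective together with the two added ones. The use of scikit-learn, the iris data set, and the particular hyper-parameters are assumptions chosen only for the example.

```python
# Minimal sketch of the three objectives used in the multi-objectivization
# approach, evaluated for one hyper-parameter setting of an MLP.
# Library, data set, and hyper-parameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def objectives(hidden_units, alpha):
    """Return (classification error, -kappa, RMSE) for one setting.

    All three values are to be minimized by the tuner; the kappa statistic
    is negated because it is originally maximized."""
    model = MLPClassifier(hidden_layer_sizes=(hidden_units,), alpha=alpha,
                          max_iter=1000, random_state=0).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    error = np.mean(pred != y_te)              # original single objective
    kappa = cohen_kappa_score(y_te, pred)      # added objective 1 (maximized)
    proba = model.predict_proba(X_te)          # network outputs
    onehot = np.eye(proba.shape[1])[y_te]      # one-hot targets
    rmse = np.sqrt(np.mean((proba - onehot) ** 2))  # added objective 2 (minimized)
    return error, -kappa, rmse

print(objectives(hidden_units=10, alpha=1e-4))
```

A multi-objective tuner (e.g. an evolutionary algorithm) would call such an evaluation for every candidate hyper-parameter setting and search for the Pareto front of the three objectives.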

Keywords

Multi-objective optimization · Parameter tuning · Neural networks · Surrogate modelling · Multi-objectivization



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Martin Pilát ¹
  • Roman Neruda ²
  1. Faculty of Mathematics and Physics, Charles University in Prague, Prague, Czech Republic
  2. Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague, Czech Republic
