Automating Autoencoder Architecture Configuration: An Evolutionary Approach

  • Francisco Charte
  • Antonio J. Rivera
  • Francisco Martínez
  • María J. del Jesus
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11486)

Abstract

Learning from existing data allows building models able to classify patterns, infer association rules, predict future values in time series, and much more. Choosing the right features is a vital step of the learning process, especially when dealing with high-dimensional spaces. Autoencoders (AEs) have shown the ability to conduct manifold learning, compressing the original feature space without losing useful information. However, no single AE architecture is optimal for all datasets. In this paper we show how evolutionary approaches can automate AE architecture configuration. First, an encoding that embeds the AE configuration in a chromosome is proposed. Then, two evolutionary alternatives are compared against exhaustive search. The results show the clear superiority of the evolutionary approach.
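As a rough illustration of the idea, the sketch below encodes a hypothetical AE architecture (number of hidden layers, units per layer, activation function) as an integer chromosome and evolves a population with a simple (mu + lambda) loop. The gene layout, all parameter names, and the stand-in fitness function are assumptions for illustration only, not the encoding or algorithm proposed in the paper; a real fitness would train the decoded AE and return its validation reconstruction error.

```python
# Illustrative sketch only: a hypothetical integer chromosome for an AE
# architecture plus a minimal (mu + lambda) evolutionary loop. This is NOT
# the paper's encoding; gene layout and fitness are placeholder assumptions.
import random

ACTIVATIONS = ["sigmoid", "tanh", "relu"]  # hypothetical gene alphabet
MAX_LAYERS = 3
MAX_UNITS = 256

def random_chromosome():
    """Chromosome = [n_layers, units_1 .. units_MAX_LAYERS, activation_index]."""
    n_layers = random.randint(1, MAX_LAYERS)
    units = [random.randint(2, MAX_UNITS) for _ in range(MAX_LAYERS)]
    return [n_layers] + units + [random.randrange(len(ACTIVATIONS))]

def decode(chrom):
    """Map a chromosome to an encoder layer specification."""
    n_layers = chrom[0]
    return {"hidden_units": chrom[1:1 + n_layers],
            "activation": ACTIVATIONS[chrom[-1]]}

def fitness(chrom):
    """Placeholder cost to minimize. A real fitness would build and train
    the decoded AE, then return its validation reconstruction error."""
    return sum(decode(chrom)["hidden_units"])

def mutate(chrom, p=0.2):
    """Resample each gene with probability p, within its legal range."""
    child = chrom[:]
    if random.random() < p:
        child[0] = random.randint(1, MAX_LAYERS)
    for i in range(1, 1 + MAX_LAYERS):
        if random.random() < p:
            child[i] = random.randint(2, MAX_UNITS)
    if random.random() < p:
        child[-1] = random.randrange(len(ACTIVATIONS))
    return child

def evolve(pop_size=20, generations=50):
    """(mu + lambda) loop: parents and offspring compete for survival."""
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(population)) for _ in range(pop_size)]
        population = sorted(population + offspring, key=fitness)[:pop_size]
    return population[0]

print(decode(evolve()))
```

Evaluating each individual requires only decoding and scoring it, so such a loop visits far fewer configurations than an exhaustive sweep over every combination of layers, units, and activations, which is the comparison the paper reports.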

Keywords

Deep learning · Autoencoder · Optimization · Evolutionary

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Andalusian Research Institute in Data Science and Computational Intelligence, Computer Science Department, Universidad de Jaén, Jaén, Spain