
Neural Random Access Machines Optimized by Differential Evolution

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11298)

Abstract

Recently, a research trend of learning algorithms by means of deep learning techniques has emerged. Most of these models are different implementations of the controller-interface abstraction: they use a neural controller as a “processor” and provide different interfaces for input, output and memory management. Within this trend, we consider the Neural Random-Access Machine (NRAM) of particular interest, because this model is also able to solve problems that require indirect memory references. In this paper we propose a version of the Neural Random-Access Machine in which the core neural controller is trained with the Differential Evolution meta-heuristic instead of the usual backpropagation algorithm. Experimental results showing that this approach is effective and competitive are also presented.
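To make the training scheme concrete, the sketch below shows how a classic Differential Evolution loop (DE/rand/1/bin) can evolve the flattened weight vector of a small feed-forward controller in place of gradient-based backpropagation. This is only an illustrative sketch on a toy regression task, not the authors' DENN-NRAM implementation: the network architecture, population size and F/CR settings are assumptions, and the real NRAM controller also drives registers, gates and an external memory that are omitted here.

```python
# Minimal sketch (assumptions only, not the DENN-NRAM code): DE/rand/1/bin
# evolving the flattened weights of a tiny feed-forward "controller" on a
# toy regression task, instead of training it with backpropagation.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

N_IN, N_HID, N_OUT = 1, 8, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total number of weights

def forward(w, x):
    """Evaluate the controller with weights unpacked from the flat vector w."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(w):
    """Mean squared error of the controller on the toy task; lower is better."""
    return float(np.mean((forward(w, X) - Y) ** 2))

NP, F, CR, GENERATIONS = 40, 0.5, 0.9, 500  # illustrative hyper-parameters
pop = rng.uniform(-1.0, 1.0, size=(NP, DIM))
fit = np.array([fitness(ind) for ind in pop])

for _ in range(GENERATIONS):
    for i in range(NP):
        # DE/rand/1 mutation: combine three distinct individuals, all != i
        a, b, c = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        # Binomial crossover with one gene guaranteed to come from the mutant
        mask = rng.random(DIM) < CR
        mask[rng.integers(DIM)] = True
        trial = np.where(mask, mutant, pop[i])
        # Greedy selection: keep the trial only if it is at least as good
        f_trial = fitness(trial)
        if f_trial <= fit[i]:
            pop[i], fit[i] = trial, f_trial

print("best MSE:", fit.min())
```

The population-based, gradient-free nature of this loop is what allows the same scheme, in principle, to be applied to a controller whose interaction with memory is not easily differentiable.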


Notes

  1. https://github.com/Gabriele91/DENN-NRAM-RESULTS-2018.


Author information

Corresponding author: Valentina Poggioni


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Baioletti, M., Belli, V., Di Bari, G., Poggioni, V. (2018). Neural Random Access Machines Optimized by Differential Evolution. In: Ghidini, C., Magnini, B., Passerini, A., Traverso, P. (eds.) AI*IA 2018 – Advances in Artificial Intelligence. AI*IA 2018. Lecture Notes in Computer Science, vol 11298. Springer, Cham. https://doi.org/10.1007/978-3-030-03840-3_23

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-03840-3_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-03839-7

  • Online ISBN: 978-3-030-03840-3

  • eBook Packages: Computer Science (R0)
