An Analysis of the Local Optima Storage Capacity of Hopfield Network Based Fitness Function Models

  • Chapter

Part of the book series: Lecture Notes in Computer Science (TCCI, volume 8790)

Abstract

A Hopfield Neural Network (HNN) with a new weight update rule can be treated as a second order Estimation of Distribution Algorithm (EDA) or Fitness Function Model (FFM) for solving optimisation problems. The HNN models promising solutions and has a capacity for storing a certain number of local optima as low energy attractors. Solutions are generated by sampling the patterns stored in the attractors. The number of attractors a network can store (its capacity) has an impact on solution diversity and, consequently, on solution quality. This paper introduces two new HNN learning rules and presents the Hopfield EDA (HEDA), which learns weight values from samples of the fitness function. It investigates the attractor storage capacity of the HEDA and shows it to be equal to that known in the literature for a standard HNN. The relationship between HEDA capacity and linkage order is also investigated.
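The HEDA's two new learning rules are developed in the chapter itself; as a point of reference, the attractor storage and recall behaviour the abstract describes can be sketched with a standard HNN trained by the classic Hebbian rule. The sketch below is illustrative only (pattern count, network size, and noise level are arbitrary choices, not values from the paper): patterns stored as low-energy attractors are recovered by running the network dynamics from a corrupted starting state.

```python
import numpy as np

def train_hebbian(patterns):
    # Standard Hebbian rule: W = (1/N) * sum_p x_p x_p^T, with zero diagonal.
    # (The HEDA uses different learning rules; this is the classic HNN baseline.)
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_iters=100):
    # Asynchronous threshold updates until the state settles into an attractor.
    state = state.copy()
    for _ in range(max_iters):
        prev = state.copy()
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
        if np.array_equal(state, prev):  # fixed point reached: an attractor
            break
    return state

rng = np.random.default_rng(0)
# Three bipolar patterns standing in for stored "local optima";
# well below the capacity limit for a 64-unit network.
patterns = rng.choice([-1, 1], size=(3, 64))
W = train_hebbian(patterns)

# Corrupt a stored pattern and check the dynamics restore it.
noisy = patterns[0].copy()
noisy[:5] *= -1
restored = recall(W, noisy)
print(np.array_equal(restored, patterns[0]))
```

Because only three patterns are stored in a 64-unit network, the corrupted state lies well inside the first pattern's basin of attraction and recall succeeds; sampling from random starting states would instead draw from whichever attractors (stored optima or spurious states) the dynamics reach.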



Author information

Correspondence to Kevin Swingler.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Swingler, K., Smith, L. (2014). An Analysis of the Local Optima Storage Capacity of Hopfield Network Based Fitness Function Models. In: Nguyen, N., Kowalczyk, R., Fred, A., Filipe, J. (eds) Transactions on Computational Collective Intelligence XVII. Lecture Notes in Computer Science, vol 8790. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44994-3_13

  • DOI: https://doi.org/10.1007/978-3-662-44994-3_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-44993-6

  • Online ISBN: 978-3-662-44994-3

  • eBook Packages: Computer Science, Computer Science (R0)
