Abstract
A Hopfield Neural Network (HNN) with a new weight update rule can be treated as a second-order Estimation of Distribution Algorithm (EDA) or Fitness Function Model (FFM) for solving optimisation problems. The HNN models promising solutions and has the capacity to store a certain number of local optima as low-energy attractors. Solutions are generated by sampling the patterns stored in those attractors. The number of attractors a network can store (its capacity) affects solution diversity and, consequently, solution quality. This paper introduces two new HNN learning rules and presents the Hopfield EDA (HEDA), which learns weight values from samples of the fitness function. It investigates the attractor storage capacity of the HEDA and shows it to be equal to that reported in the literature for a standard HNN. The relationship between HEDA capacity and linkage order is also investigated.
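The HEDA's new learning rules are not reproduced in this abstract. As a minimal sketch of the underlying mechanism the paper builds on, the following assumes the classical Hebbian rule of a standard HNN (an assumption; the paper's rules differ) to store two hand-picked patterns as low-energy attractors and recover one from a corrupted starting state:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hebbian(patterns):
    """Build a Hopfield weight matrix with the classical Hebbian rule.
    patterns: array of shape (p, n) with entries in {-1, +1}."""
    p, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, state, max_iters=100):
    """Asynchronously update units until a fixed point (attractor) is reached."""
    state = state.copy()
    n = len(state)
    for _ in range(max_iters):
        changed = False
        for i in rng.permutation(n):  # random update order
            new = 1 if W[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:  # fixed point: state is an attractor
            break
    return state

# Two orthogonal example patterns standing in for local optima.
patterns = np.array([[1,  1, 1,  1, -1, -1, -1, -1],
                     [1, -1, 1, -1,  1, -1,  1, -1]])
W = train_hebbian(patterns)

noisy = patterns[0].copy()
noisy[:2] *= -1  # corrupt two bits
print(recall(W, noisy))  # settles back onto the stored pattern
```

Sampling in an EDA-style loop would start `recall` from random states, so that each run falls into one of the stored attractors; the network's capacity bounds how many distinct optima such sampling can ever return.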
© 2014 Springer-Verlag Berlin Heidelberg
Swingler, K., Smith, L. (2014). An Analysis of the Local Optima Storage Capacity of Hopfield Network Based Fitness Function Models. In: Nguyen, N., Kowalczyk, R., Fred, A., Filipe, J. (eds) Transactions on Computational Collective Intelligence XVII. Lecture Notes in Computer Science, vol 8790. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44994-3_13
Print ISBN: 978-3-662-44993-6
Online ISBN: 978-3-662-44994-3