Hopfield Networks, Simulated Annealing, and Chaotic Neural Networks

  • Ke-Lin Du
  • M. N. S. Swamy


The Hopfield model is the most popular dynamic neural network model. Simulated annealing, inspired by annealing in metallurgy, is a metaheuristic for approximating the global optimum of a function over a large search space; the annealing concept is also widely used in training recurrent neural networks. Chaotic neural networks are recurrent neural networks endowed with chaotic dynamics. The cellular neural network generalizes the Hopfield network to a two- or higher-dimensional array of cells. This chapter is dedicated to these topics, which are widely used for solving combinatorial optimization problems.
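To make the annealing idea concrete, here is a minimal sketch (not code from this chapter) of simulated annealing minimizing the toy objective f(x) = x², using the Metropolis acceptance rule and a geometric cooling schedule. The schedule parameters (initial temperature, cooling factor, step size) are illustrative assumptions, not values prescribed by the chapter.

```python
import math
import random

def f(x):
    # Toy objective with global minimum at x = 0.
    return x * x

def simulated_annealing(x0, t0=10.0, alpha=0.95, steps=2000, seed=0):
    """Minimize f by simulated annealing with geometric cooling T <- alpha*T."""
    rng = random.Random(seed)
    x, t = x0, t0
    best = x
    for _ in range(steps):
        # Propose a random neighbor of the current state.
        cand = x + rng.uniform(-1.0, 1.0)
        delta = f(cand) - f(x)
        # Metropolis criterion: always accept improvements; accept a worse
        # move with probability exp(-delta / T), which shrinks as T cools.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if f(x) < f(best):
            best = x
        t *= alpha  # geometric cooling toward T -> 0
    return best

x_min = simulated_annealing(x0=5.0)
print(abs(x_min) < 0.5)
```

At high temperature the walk accepts most moves and explores broadly; as T decays the acceptance of uphill moves becomes rare and the search settles into a (hopefully global) minimum, here landing near x = 0.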



Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, Concordia University, Montreal, Canada
  2. Xonlink Inc., Hangzhou, China
