
Atomic Switch, pp. 201–243

Atomic Switch Networks for Neuroarchitectonics: Past, Present, Future

  • R. Aguilera
  • K. Scharnhorst
  • S. L. Lilak
  • C. S. Dunham
  • M. Aono
  • A. Z. Stieg
  • J. K. Gimzewski (corresponding author)
Conference paper
Part of the Advances in Atom and Single Molecule Machines book series (AASMM)

Abstract

Artificial realizations of the mammalian brain, and their integration into electronic components, are explored through neuromorphic architectures (neuroarchitectonics) on CMOS-compatible platforms. Exploration of neuromorphic technologies continues to develop as an alternative computational paradigm as both capacity and capability reach their fundamental limits with the end of the transistor-driven industrial phenomenon of Moore's law. Here, we consider the electronic landscape within neuromorphic technologies and the role of the atomic switch as a model device. We report the fabrication of an atomic switch network (ASN) showing critical dynamics and harness criticality to perform benchmark signal classification and Boolean logic tasks. Observed evidence of biomimetic behavior, such as synaptic plasticity and fading memory, enables the ASN to attain cognitive capability within the context of artificial neural networks.
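The ASN is operated as a physical reservoir computer: the network's nonlinear, history-dependent dynamics transform an input signal, and only a linear readout is trained. The sketch below is a minimal software analogue of that scheme, not the authors' implementation; the reservoir size, spectral radius, and delay-recall task are illustrative assumptions chosen to exhibit the fading-memory property the abstract describes.

```python
import numpy as np

# Hypothetical software stand-in for the physical ASN: a fixed random
# recurrent "reservoir" transforms the input; only a linear readout is
# trained (here by ridge regression), as in standard reservoir computing.
rng = np.random.default_rng(0)

N = 100                                   # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, N)          # fixed random input weights
W = rng.normal(0.0, 1.0, (N, N))
# Scale spectral radius below 1 so dynamics sit near, but inside, the
# stable regime ("edge of chaos") and the reservoir has fading memory.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state history."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy benchmark: recall the input delayed by 2 steps (a fading-memory task).
T = 500
u = rng.uniform(-1.0, 1.0, T)
y = np.roll(u, 2)                         # target: delay-2 copy of input
X = run_reservoir(u)

washout = 50                              # discard the initial transient
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)  # ridge readout

pred = A @ W_out
nrmse = np.sqrt(np.mean((pred - b) ** 2)) / np.std(b)
print(f"NRMSE on delay-2 recall: {nrmse:.3f}")
```

In a hardware ASN the role of `run_reservoir` is played by the device itself, with states sampled from multiple electrodes; only the readout step is computed externally, which is what makes the approach attractive at device scale.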


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • R. Aguilera (1)
  • K. Scharnhorst (1)
  • S. L. Lilak (1)
  • C. S. Dunham (1)
  • M. Aono (2)
  • A. Z. Stieg (3, 4)
  • J. K. Gimzewski (1, 3, 4), corresponding author
  1. Department of Chemistry and Biochemistry, UCLA, Los Angeles, USA
  2. International Center for Materials Nanoarchitectonics (MANA), National Institute for Materials Science (NIMS), Tsukuba, Japan
  3. WPI Center for Materials Nanoarchitectonics (MANA), National Institute for Materials Science (NIMS), Tsukuba, Japan
  4. California NanoSystems Institute (CNSI), UCLA, Los Angeles, USA
