Efficient Neuromorphic Systems and Emerging Technologies: Prospects and Perspectives

Abstract

Recent advances in machine learning, notably deep learning, have resulted in unprecedented success in a wide variety of recognition tasks, including vision, speech, and natural language processing. However, implementing such neural algorithms on conventional “von Neumann” architectures requires orders of magnitude more area and power than the biological brain. This is mainly attributed to the inherent mismatch between the computational units of such models (neurons and synapses) and the underlying CMOS transistors. In addition, these algorithms are highly memory-intensive and suffer from memory bandwidth limitations due to the significant amount of data transferred between memory and computing units. Recent experiments in spintronics have opened up the possibility of implementing such computing kernels with single device structures that can be arranged in crossbar architectures, resulting in a compact and energy-efficient “in-memory computing” platform. In this chapter, we review spintronic device structures consisting of single-domain and domain-wall-motion based devices for mimicking neuronal and synaptic units. System-level simulations indicate ∼100× improvement in energy consumption for such spintronic implementations over corresponding CMOS implementations across different computing workloads.
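To make the “in-memory computing” idea concrete: when synaptic weights are stored as device conductances in a crossbar and inputs are applied as row voltages, each column current is, by Ohm’s and Kirchhoff’s laws, a weighted sum of the inputs, so the vector-matrix multiplication at the core of neural network inference is performed directly where the weights are stored. The following Python snippet is a minimal, idealized sketch of that operation together with a crude thresholding “neuron”; the function names, conductance ranges, and threshold value are illustrative assumptions, not values taken from the chapter.

```python
# Minimal, idealized sketch of a resistive-crossbar dot product.
# All parameters below are illustrative assumptions, not chapter data.
import numpy as np

def crossbar_currents(voltages, conductances):
    """Column currents of an ideal resistive crossbar.

    voltages:     (N,) input voltages applied on the rows.
    conductances: (N, M) programmed synaptic conductances.
    Returns (M,) column currents: by Ohm's and Kirchhoff's laws each
    column current is the dot product of the input-voltage vector with
    that column of the conductance matrix, i.e. a vector-matrix
    multiply computed "in memory".
    """
    return voltages @ conductances

def threshold_neurons(currents, threshold):
    """Crude stand-in for a thresholding neuron (e.g. a device that
    switches once the current it receives exceeds a critical value)."""
    return (currents >= threshold).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g = rng.uniform(0.0, 1e-4, size=(4, 3))   # siemens, illustrative range
    v = np.array([0.1, 0.0, 0.1, 0.1])        # volts, one input pattern
    i_col = crossbar_currents(v, g)
    print("column currents (A):", i_col)
    print("neuron outputs:", threshold_neurons(i_col, threshold=1.5e-5))
```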

Author information

Correspondence to Kaushik Roy.

Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Sengupta, A., Ankit, A., Roy, K. (2017). Efficient Neuromorphic Systems and Emerging Technologies: Prospects and Perspectives. In: Chattopadhyay, A., Chang, C., Yu, H. (eds) Emerging Technology and Architecture for Big-data Analytics. Springer, Cham. https://doi.org/10.1007/978-3-319-54840-1_12

  • DOI: https://doi.org/10.1007/978-3-319-54840-1_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-54839-5

  • Online ISBN: 978-3-319-54840-1

  • eBook Packages: Engineering, Engineering (R0)
