
Training Data on Recursive Parallel Processors for Deep Learning

  • Conference paper
Advances in Data Sciences, Security and Applications

Part of the book series: Lecture Notes in Electrical Engineering ((LNEE,volume 612))


Abstract

Deep learning is a subfield of machine learning, which is in turn a subfield of artificial intelligence. Deep learning models achieve extremely high accuracy across a huge range of applications, but each model works only for a specific task, training times run from hours to days, and the models are computationally intensive, inflexible, and lack general intelligence. The focus of this paper is training data on recursive parallel processors (or supercomputers) for deep learning; such training would help address these challenges while handling various application domains.
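The abstract's core idea, speeding up training by spreading it across many processors, is commonly realized as data parallelism: the training data is sharded across workers, each worker computes a gradient on its shard, and the averaged gradient updates a shared model. The sketch below illustrates this scheme on a toy linear model; all names (`data_parallel_sgd`, `grad_on_shard`, the learning rate and shard count) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of data-parallel training (illustrative, not the paper's
# method): shard the dataset, compute per-shard gradients, average, update.

def grad_on_shard(w, shard):
    # Gradient of mean squared error for the model y = w * x on one shard.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def data_parallel_sgd(data, num_workers=4, lr=0.01, steps=100):
    # Round-robin split of the data across workers.
    shards = [data[i::num_workers] for i in range(num_workers)]
    w = 0.0
    for _ in range(steps):
        # On a real parallel machine each gradient is computed concurrently.
        grads = [grad_on_shard(w, s) for s in shards]
        w -= lr * sum(grads) / num_workers  # average gradients, then update
    return w

# Toy data from y = 3x: the learned weight should approach 3.
data = [(x, 3.0 * x) for x in range(1, 9)]
print(round(data_parallel_sgd(data), 3))  # → 3.0
```

Because the shard gradients are independent, the loop body maps directly onto parallel processors; only the averaging step requires communication, which is the main scalability bottleneck in practice.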



Author information


Corresponding author

Correspondence to Rajiv Chopra.



Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Raheja, S., Chopra, R. (2020). Training Data on Recursive Parallel Processors for Deep Learning. In: Jain, V., Chaudhary, G., Taplamacioglu, M., Agarwal, M. (eds) Advances in Data Sciences, Security and Applications. Lecture Notes in Electrical Engineering, vol 612. Springer, Singapore. https://doi.org/10.1007/978-981-15-0372-6_5
