
Performance Analysis of Deep Neural Network and Stacked Autoencoder for Image Classification

  • Chapter

Part of the book series: EAI/Springer Innovations in Communication and Computing ((EAISICC))

Abstract

Real-time image classification poses several challenges, such as extracting features from data affected by noise and uncertainty. Hand-crafted, task-specific feature extraction is not feasible in every case; to overcome this, automatic feature extraction is built into the layers of a deep neural network (DNN) and a stacked autoencoder (SAE), which improves classification accuracy and speed. In this paper, the MNIST image dataset is trained and tested using these two networks. The training time and accuracy are measured for the MNIST images using the DNN algorithm. A stacked autoencoder is then constructed and trained one layer at a time. Here the SAE consists of three layers stacked together, and its parameters are varied so that the constructed SAE outperforms the DNN model, improving validation-set accuracy by a noticeable margin. This paper demonstrates the effectiveness of the SAE model over the DNN through a performance analysis of binary handwritten images, with a time-accuracy trade-off.
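The greedy layer-wise training the abstract describes — each autoencoder layer trained on the previous layer's codes, then stacked into a feature extractor — can be sketched in plain NumPy. This is an illustrative sketch, not the authors' implementation: the layer sizes (256, 64, 10), learning rate, tied decoder weights, and the random binary stand-in for MNIST are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Clip to avoid overflow in exp for large pre-activations.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_autoencoder(X, n_hidden, epochs=50, lr=0.5):
    """Train one tied-weight autoencoder layer by gradient descent
    on the squared reconstruction error; return (W, b, codes)."""
    n, d = X.shape
    W = rng.normal(0.0, 0.1, (d, n_hidden))
    b = np.zeros(n_hidden)   # encoder bias
    c = np.zeros(d)          # decoder bias
    for _ in range(epochs):
        H = sigmoid(X @ W + b)        # encode
        Xhat = sigmoid(H @ W.T + c)   # decode with tied weights W.T
        err = Xhat - X                # reconstruction error
        d_out = err * Xhat * (1 - Xhat)        # delta at output
        d_hid = (d_out @ W) * H * (1 - H)      # delta at hidden layer
        gW = X.T @ d_hid + d_out.T @ H         # tied-weight gradient
        W -= lr * gW / n
        b -= lr * d_hid.sum(axis=0) / n
        c -= lr * d_out.sum(axis=0) / n
    return W, b, sigmoid(X @ W + b)

# Toy stand-in for MNIST: 64 binary "images" of 28 x 28 = 784 pixels.
X = (rng.random((64, 784)) > 0.5).astype(float)

# Greedy layer-wise pretraining: each layer is an autoencoder trained
# on the codes produced by the layer below, then stacked.
codes, stack = X, []
for n_hidden in (256, 64, 10):  # three stacked layers, sizes assumed
    W, b, codes = train_autoencoder(codes, n_hidden)
    stack.append((W, b))

def encode(X, stack):
    """Run inputs through the stacked encoders."""
    for W, b in stack:
        X = sigmoid(X @ W + b)
    return X

features = encode(X, stack)
print(features.shape)  # (64, 10)
```

In the full pipeline, the 10-dimensional codes from the top layer would feed a softmax classifier, and the whole stack would then be fine-tuned end to end with backpropagation on the labeled MNIST digits.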



Author information


Correspondence to S. N. Shivappriya.



Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Shivappriya, S.N., Harikumar, R. (2019). Performance Analysis of Deep Neural Network and Stacked Autoencoder for Image Classification. In: Anandakumar, H., Arulmurugan, R., Onn, C. (eds) Computational Intelligence and Sustainable Systems. EAI/Springer Innovations in Communication and Computing. Springer, Cham. https://doi.org/10.1007/978-3-030-02674-5_1


  • DOI: https://doi.org/10.1007/978-3-030-02674-5_1


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-02673-8

  • Online ISBN: 978-3-030-02674-5

  • eBook Packages: Engineering (R0)
