Shallow Network Performance in an Increasing Image Dimension

  • Mohd Razif Shamsuddin
  • Shuzlina Abdul-Rahman
  • Azlinah Mohamed

Conference paper, part of the Communications in Computer and Information Science book series (CCIS, volume 652).


This paper describes the performance of a shallow network as the dimensionality of its image input representation increases, and highlights the generalization problem in shallow neural networks despite their extensive usage. A backpropagation algorithm is chosen to train the network, as it is widely used in many classification problems. A set of binary images at three different sizes is used in the experiment; the idea is to assess how the network performs as the scale of the input dimension increases. In addition, the benchmark MNIST handwritten digit dataset is used to test the performance of the shallow network. The results, which are then discussed and explained, show how the network's performance changes as the input scale increases. From the conducted experiments, it is believed that the complexity of the input size and the breadth of the network affect the performance of the neural network. Such results can serve as a reference and guide for researchers working with the backpropagation algorithm.


Keywords: Neural Network · Shallow network · Backpropagation · Image recognition
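The setup the abstract describes can be sketched as follows: a one-hidden-layer ("shallow") network trained with backpropagation on binary images whose side length, and hence input dimension, is scaled up. This is a minimal illustrative sketch only; the image sizes, hidden-layer width, learning rate, and synthetic two-class data below are assumptions for demonstration, not the paper's actual experimental settings.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ShallowNet:
    """One hidden layer, sigmoid activations, squared-error loss."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)   # hidden activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def backward(self, X, t, lr=0.5):
        # Standard delta rule: propagate the output error back one layer.
        y = self.forward(X)
        d_out = (y - t) * y * (1.0 - y)
        d_hid = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= lr * self.h.T @ d_out
        self.b2 -= lr * d_out.sum(axis=0)
        self.W1 -= lr * X.T @ d_hid
        self.b1 -= lr * d_hid.sum(axis=0)
        return float(0.5 * ((y - t) ** 2).sum())

# Toy version of the experiment: binary images of increasing side length,
# two easily separable classes (mostly-dark vs mostly-bright pixels).
for side in (5, 7, 10):                 # hypothetical image sizes
    n_in = side * side
    rng = np.random.default_rng(1)
    X0 = (rng.random((20, n_in)) < 0.2).astype(float)  # mostly-dark class
    X1 = (rng.random((20, n_in)) < 0.8).astype(float)  # mostly-bright class
    X = np.vstack([X0, X1])
    t = np.vstack([np.zeros((20, 1)), np.ones((20, 1))])

    net = ShallowNet(n_in, n_hidden=10, n_out=1)
    for _ in range(200):
        loss = net.backward(X, t)
    acc = float(((net.forward(X) > 0.5) == t).mean())
    print(f"{side}x{side} input: final loss {loss:.4f}, train accuracy {acc:.2f}")
```

The loop over `side` mirrors the paper's idea of holding the network architecture fixed while the input dimension grows, so any change in convergence or accuracy can be attributed to the input scale.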



Copyright information

© Springer Nature Singapore Pte Ltd. 2016

Authors and Affiliations

  • Mohd Razif Shamsuddin (1)
  • Shuzlina Abdul-Rahman (1)
  • Azlinah Mohamed (1)

  1. Faculty of Computer Sciences and Mathematics, Universiti Teknologi MARA, Shah Alam, Malaysia
