An Improved Self-Structuring Neural Network

  • Conference paper
  • First Online:
Trends and Applications in Knowledge Discovery and Data Mining (PAKDD 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9794)

Abstract

Creating a neural-network-based classification model is traditionally accomplished by trial and error. However, the trial-and-error structuring method normally suffers from several difficulties, including overtraining. In this article, a new algorithm that simplifies the structuring of neural network classification models is proposed. It aims to create a large structure from which classifiers with generally good predictive accuracy on domain applications can be derived from the training dataset. The proposed algorithm tunes crucial NN model thresholds during the training phase in order to cope with the dynamic behavior of the learning process, which may reduce the chance of overfitting the training dataset or of early convergence of the model. Several experiments using our algorithm, as well as other classification algorithms, have been conducted on a number of datasets from the University of California Irvine (UCI) repository in order to assess the pros and cons of the proposed NN method. The results show that our algorithm outperforms the compared classification algorithms with respect to several performance measures.
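The abstract describes the general self-structuring idea: grow the network during training and adapt key thresholds as learning progresses, rather than fixing the architecture by trial and error. The sketch below is only an illustration of that general idea, not the authors' published algorithm; every name and parameter in it (`GrowingMLP`, `improve_eps`, `patience`, the growth and threshold-decay rules) is an assumption invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class GrowingMLP:
    """One-hidden-layer network that can add hidden units during training.

    Illustrative only: a stand-in for the general class of self-structuring
    networks, not the algorithm proposed in this paper.
    """

    def __init__(self, n_in, n_hidden=2, lr=0.5):
        self.lr = lr
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.h @ self.W2 + self.b2)

    def backprop(self, X, y):
        out = self.forward(X)
        # gradient of mean squared error through both sigmoid layers
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ d_out / len(X)
        self.b2 -= self.lr * d_out.mean(axis=0)
        self.W1 -= self.lr * X.T @ d_h / len(X)
        self.b1 -= self.lr * d_h.mean(axis=0)
        return float(np.mean((out - y) ** 2))

    def add_hidden_unit(self):
        # grow the structure by one randomly initialised hidden neuron
        n_in = self.W1.shape[0]
        self.W1 = np.hstack([self.W1, rng.normal(scale=0.5, size=(n_in, 1))])
        self.b1 = np.append(self.b1, 0.0)
        self.W2 = np.vstack([self.W2, rng.normal(scale=0.5, size=(1, 1))])

def train(X, y, epochs=4000, improve_eps=1e-4, patience=200, max_hidden=10):
    net = GrowingMLP(X.shape[1])
    best, stall = np.inf, 0
    for _ in range(epochs):
        err = net.backprop(X, y)
        if best - err > improve_eps:      # meaningful improvement seen
            best, stall = err, 0
        else:
            stall += 1
        if stall >= patience:             # learning has stalled
            if net.W1.shape[1] < max_hidden:
                net.add_hidden_unit()     # grow the structure
            improve_eps *= 0.5            # relax the tuned threshold
            best, stall = err, 0
    return net

# XOR is a classic case where a too-small structure cannot fit the data,
# so structural growth matters
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
net = train(X, y)
preds = (net.forward(X) > 0.5).astype(int)
```

The two adaptive elements, growing the hidden layer when validation of progress stalls and decaying the improvement threshold over time, stand in for the threshold tuning the abstract refers to; a faithful implementation would follow the rules given in the paper itself.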



Author information

Correspondence to Fadi Thabtah.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Mohammad, R.M., Thabtah, F., McCluskey, L. (2016). An Improved Self-Structuring Neural Network. In: Cao, H., Li, J., Wang, R. (eds) Trends and Applications in Knowledge Discovery and Data Mining. PAKDD 2016. Lecture Notes in Computer Science, vol 9794. Springer, Cham. https://doi.org/10.1007/978-3-319-42996-0_4

  • DOI: https://doi.org/10.1007/978-3-319-42996-0_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-42995-3

  • Online ISBN: 978-3-319-42996-0

  • eBook Packages: Computer Science, Computer Science (R0)
