The Neural Network for Online Learning Task Without Manual Feature Extraction

  • Conference paper
  • First Online:
Advances in Neural Networks – ISNN 2019 (ISNN 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11554)

Abstract

This article addresses the problem of feature extraction in online learning tasks. In many cases, proper feature extraction is very time-consuming. Deep neural networks can often solve this problem, but deep models are computationally expensive and therefore hard to apply to online learning tasks, which require frequent model updates. This paper proposes a lightweight neural network architecture that can be trained in online mode and does not require complex handcrafted features. Its short per-sample processing time distinguishes the proposed model from more complex deep neural networks. The architecture and learning process of the proposed model are discussed in detail, with special attention paid to a fast software implementation. On benchmarks, we show that the developed implementation processes a single sample several times faster than implementations based on deep learning frameworks. Experiments on a CTR prediction task show that the proposed neural network with raw features matches the performance of a logistic regression model with handcrafted features. For a clear description of the proposed architecture, we use the metagraph approach.
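To make the setting concrete, below is a minimal sketch of an online-trained, lightweight network for CTR prediction over raw categorical features. The class name, layer sizes, hashing trick, and single-hidden-layer structure are illustrative assumptions made for this example, not the architecture described in the paper; the sketch only shows how per-sample online updates over raw (unengineered) features can be organized.

```python
# A minimal sketch (not the paper's architecture): a tiny network over raw
# categorical features for CTR prediction, updated one sample at a time.
# All names, sizes, the hashing trick, and the single hidden layer are assumptions.
import numpy as np

class OnlineCtrNet:
    def __init__(self, n_fields, n_buckets=2**18, emb_dim=8, hidden=32, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.n_fields, self.n_buckets, self.emb_dim, self.lr = n_fields, n_buckets, emb_dim, lr
        self.emb = rng.normal(0.0, 0.01, size=(n_buckets, emb_dim))  # shared embedding table
        in_dim = n_fields * emb_dim
        self.W1 = rng.normal(0.0, np.sqrt(2.0 / in_dim), size=(in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, np.sqrt(2.0 / hidden), size=hidden)
        self.b2 = 0.0

    def _indices(self, sample):
        # sample: one raw string value per categorical field; hashed directly,
        # no manual feature engineering (Python's hash() is process-salted).
        return np.array([hash((f, v)) % self.n_buckets for f, v in enumerate(sample)])

    def _forward(self, idx):
        x = self.emb[idx].reshape(-1)                 # concatenated embeddings
        h = np.maximum(self.W1.T @ x + self.b1, 0.0)  # ReLU hidden layer
        p = 1.0 / (1.0 + np.exp(-(self.w2 @ h + self.b2)))
        return x, h, p

    def predict(self, sample):
        return self._forward(self._indices(sample))[2]

    def update(self, sample, y):
        # One online SGD step on the log-loss for a single sample.
        idx = self._indices(sample)
        x, h, p = self._forward(idx)
        d_out = p - y                        # dL/d(logit) for sigmoid + log-loss
        d_h = d_out * self.w2 * (h > 0)      # back through ReLU
        d_x = self.W1 @ d_h
        self.w2 -= self.lr * d_out * h
        self.b2 -= self.lr * d_out
        self.W1 -= self.lr * np.outer(x, d_h)
        self.b1 -= self.lr * d_h
        # Only the embedding rows touched by this sample are updated.
        np.subtract.at(self.emb, idx, self.lr * d_x.reshape(self.n_fields, self.emb_dim))
        return p

# Usage: predict-then-update on a single click event with raw string features.
net = OnlineCtrNet(n_fields=3)
p = net.update(["site_42", "mobile", "banner_top"], y=1)
```

The point of the sketch is the update path: each per-sample gradient step touches only a handful of embedding rows plus two small dense layers, which is what keeps online updates cheap compared with a full deep model.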


Author information

Corresponding author

Correspondence to Yuriy Gapanyuk.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Fedorenko, Y., Chernenkiy, V., Gapanyuk, Y. (2019). The Neural Network for Online Learning Task Without Manual Feature Extraction. In: Lu, H., Tang, H., Wang, Z. (eds.) Advances in Neural Networks – ISNN 2019. ISNN 2019. Lecture Notes in Computer Science, vol. 11554. Springer, Cham. https://doi.org/10.1007/978-3-030-22796-8_8

  • DOI: https://doi.org/10.1007/978-3-030-22796-8_8

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-22795-1

  • Online ISBN: 978-3-030-22796-8

  • eBook Packages: Computer Science, Computer Science (R0)
