Abstract
The activation function is an essential component of an artificial neuron model. Multilayer neural networks can work properly only when these functions are nonlinear. This paper presents a simple approximation of the widely used hyperbolic tangent activation function. The proposed function is computationally very efficient. Computational comparisons on two well-known test problems are discussed. The results are very promising for potential applications in FPGA chip design.
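The abstract does not give the closed form of the proposed approximation, so the sketch below only illustrates the general idea: replacing the exponential-based hyperbolic tangent with a cheap, hardware-friendly rational function. The softsign-style formula x/(1+|x|) and the error-measuring loop are assumptions chosen for illustration, not the function proposed in the paper.

```c
/*
 * Minimal sketch (not the authors' formula): illustrates replacing tanh(x),
 * which requires exponentials, with a cheap rational approximation that
 * uses only an absolute value, an addition, and a division.
 */
#include <math.h>
#include <stdio.h>

/* Hypothetical softsign-style stand-in for the proposed activation. */
static float tanh_approx(float x)
{
    return x / (1.0f + fabsf(x));
}

int main(void)
{
    float max_err = 0.0f;
    for (float x = -8.0f; x <= 8.0f; x += 0.001f) {
        float err = fabsf(tanhf(x) - tanh_approx(x));
        if (err > max_err)
            max_err = err;
    }
    /* The approximation is monotone and saturating, so it preserves the
       qualitative shape of tanh even though its values differ. */
    printf("max |tanh(x) - approx(x)| on [-8, 8]: %f\n", max_err);
    return 0;
}
```

Because the stand-in avoids exponentials entirely, it maps naturally onto fixed-point FPGA arithmetic, which is the kind of saving the abstract alludes to.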