Abstract
The choice of activation function for an artificial neural network has a significant effect on training time and task performance. The most widely used activation function today is the Rectified Linear Unit (ReLU). Despite its "dying ReLU" problem and many attempts to replace it with something better, it is still considered the default choice when building a network. In 2017, a promising new function was formulated by the Google Brain team. The proposed function, named Swish, was obtained using a combination of exhaustive and reinforcement-learning-based search. According to its authors, simply replacing ReLUs with Swish units improves top-1 classification accuracy on ImageNet by 0.9% for Mobile NASNet-A and 0.6% for Inception-ResNet-v2. This paper describes an experiment on the CIFAR-10 image set in which Swish does not appear to outperform ReLU.
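For context, ReLU is defined as ReLU(x) = max(0, x), and Swish as Swish(x) = x · σ(βx), where σ is the logistic sigmoid and β is a constant or trainable parameter (β = 1 in the variant usually compared against ReLU). The following plain-NumPy sketch is an illustration of these two definitions, not code taken from the paper:

import numpy as np

def relu(x):
    # ReLU: identity for positive inputs, zero otherwise.
    # Its gradient is exactly zero for x < 0, which is the root
    # of the "dying ReLU" problem mentioned in the abstract.
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x). Smooth and non-monotonic,
    # with a small dip below zero for negative inputs.
    return x / (1.0 + np.exp(-beta * x))

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(relu(x))   # [0. 0. 0. 1. 5.]
print(swish(x))  # approx. [-0.033 -0.269  0.     0.731  4.967]

Unlike ReLU, Swish passes a small, bounded negative signal for negative inputs, which is the property its authors credit for the reported accuracy gains.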
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Szandała, T. (2020). Benchmarking Comparison of Swish vs. Other Activation Functions on CIFAR-10 Imageset. In: Zamojski, W., Mazurkiewicz, J., Sugier, J., Walkowiak, T., Kacprzyk, J. (eds) Engineering in Dependability of Computer Systems and Networks. DepCoS-RELCOMEX 2019. Advances in Intelligent Systems and Computing, vol 987. Springer, Cham. https://doi.org/10.1007/978-3-030-19501-4_49
DOI: https://doi.org/10.1007/978-3-030-19501-4_49
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-19500-7
Online ISBN: 978-3-030-19501-4
eBook Packages: Intelligent Technologies and Robotics (R0)