Abstract
Neural networks require careful design in order to perform properly on a given task. In particular, selecting a good activation function (possibly in a data-dependent fashion) is a crucial step, which remains an open problem in the research community. Despite a large body of investigation, most current implementations simply select one fixed function from a small set of candidates, which is not adapted during training and is shared among all neurons throughout the different layers. However, neither of these two assumptions can be considered optimal in practice. In this paper, we present a principled way to perform data-dependent adaptation of the activation functions, carried out independently for each neuron. This is achieved by leveraging past and present advances in cubic spline interpolation, allowing for local adaptation of the functions around their regions of use. The resulting algorithm is relatively cheap to implement, and overfitting is counterbalanced by the inclusion of a novel damping criterion, which penalizes unwanted oscillations away from a predefined shape. Preliminary experimental results validate the proposal.
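To make the idea summarized above more concrete, the following is a minimal NumPy sketch of a spline activation function with learnable control points, plus a damping penalty toward a reference shape. It assumes a Catmull-Rom cubic spline over uniformly spaced control points initialized from a tanh; the parameterization, class and method names (`SplineActivation`, `damping_penalty`), and default values are illustrative assumptions, not the chapter's exact formulation.

```python
import numpy as np

class SplineActivation:
    """Sketch of a spline activation function (SAF): one set of learnable
    control points per neuron, evaluated with a Catmull-Rom cubic spline
    (an assumption; the chapter's exact parameterization may differ)."""

    # Catmull-Rom basis: [t^3, t^2, t, 1] @ B @ (four surrounding points).
    B = 0.5 * np.array([[-1.,  3., -3.,  1.],
                        [ 2., -5.,  4., -1.],
                        [-1.,  0.,  1.,  0.],
                        [ 0.,  2.,  0.,  0.]])

    def __init__(self, n_points=21, delta_x=0.4):
        self.delta_x = delta_x
        # Uniformly spaced abscissas centred at zero.
        self.x = delta_x * (np.arange(n_points) - (n_points - 1) / 2.0)
        # Control points initialised on a tanh shape; this is also the
        # reference shape used by the damping penalty below.
        self.q0 = np.tanh(self.x)
        self.q = self.q0.copy()

    def forward(self, s):
        """Evaluate the spline at a NumPy array of pre-activations s."""
        u = s / self.delta_x + (len(self.q) - 1) / 2.0
        # Index of the local segment; inputs outside the knot range are
        # handled by extrapolating the boundary segment (a simplification).
        i = np.clip(np.floor(u).astype(int), 1, len(self.q) - 3)
        t = u - i
        T = np.stack([t**3, t**2, t, np.ones_like(t)], axis=-1)
        Q = np.stack([self.q[i - 1], self.q[i],
                      self.q[i + 1], self.q[i + 2]], axis=-1)
        return np.einsum('...j,jk,...k->...', T, self.B, Q)

    def damping_penalty(self, lam=1e-3):
        """Hypothetical damping term: penalise deviations of the control
        points from the initial tanh shape to discourage oscillations."""
        return lam * np.sum((self.q - self.q0) ** 2)
```

In training, the control points `q` would be updated by backpropagation together with the network weights, and the damping penalty would be added to the loss; that machinery is omitted from this sketch.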
Notes
- 1.
We note that the following treatment can easily be extended to the case of a network with more than one hidden layer. However, restricting it to a single layer allows us to keep the discussion focused on the problems/advantages arising in the use of SAFs. We leave this extension to future work.
Copyright information
© 2019 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Scardapane, S., Scarpiniti, M., Comminiello, D., Uncini, A. (2019). Learning Activation Functions from Data Using Cubic Spline Interpolation. In: Esposito, A., Faundez-Zanuy, M., Morabito, F., Pasero, E. (eds) Neural Advances in Processing Nonlinear Dynamic Signals. WIRN 2017. Smart Innovation, Systems and Technologies, vol 102. Springer, Cham. https://doi.org/10.1007/978-3-319-95098-3_7
DOI: https://doi.org/10.1007/978-3-319-95098-3_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-95097-6
Online ISBN: 978-3-319-95098-3
eBook Packages: Intelligent Technologies and Robotics (R0)