Curvature-Driven Smoothing in Backpropagation Neural Networks
The standard backpropagation learning algorithm for feedforward networks aims to minimise the mean-square error defined over a set of training data. This error measure can lead to over-fitting, in which the network memorises individual points from the training set but fails to generalise satisfactorily to new data. In this paper we propose a modified error measure which reduces the tendency to over-fit and whose properties are controlled by a single scalar parameter. The new error measure depends both on the function generated by the network and on its derivatives. A new learning algorithm is derived which can be used to minimise such error measures.
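The modified error measure described above can be illustrated with a short sketch. The snippet below adds a curvature (second-derivative) penalty, weighted by a scalar parameter, to the usual mean-square error. The names `curvature_penalised_error` and `lam`, and the central finite-difference approximation to the second derivative, are assumptions of this sketch, not the paper's exact formulation; a trained network's output function would take the place of the toy scalar map used here.

```python
import numpy as np

def curvature_penalised_error(f, x, y, lam=0.1, h=1e-3):
    """Mean-square error plus a curvature penalty on the mapping f.

    The penalty term depends on the second derivative of f, here
    approximated by central finite differences (a choice made for
    this sketch), and is weighted by the scalar parameter `lam`.
    """
    pred = f(x)
    mse = np.mean((pred - y) ** 2)
    # Central-difference estimate of f''(x): (f(x+h) - 2 f(x) + f(x-h)) / h^2
    d2 = (f(x + h) - 2.0 * pred + f(x - h)) / h ** 2
    return mse + lam * np.mean(d2 ** 2)

# Toy "network": a smooth scalar function standing in for the trained map.
f = np.tanh
x = np.linspace(-2.0, 2.0, 50)
y = np.sin(x)

plain = curvature_penalised_error(f, x, y, lam=0.0)     # ordinary MSE
smoothed = curvature_penalised_error(f, x, y, lam=0.5)  # MSE + curvature term
```

Setting `lam=0` recovers the standard mean-square error, while larger values penalise highly curved mappings more strongly, which is the smoothing behaviour the single scalar parameter is meant to control.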