Curvature-Driven Smoothing in Backpropagation Neural Networks

CM Bishop

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

The standard back-propagation learning algorithm for feed-forward networks aims to minimize the mean square error defined over a set of training data. This form of error measure can lead to the problem of over-fitting in which the network stores individual data points from the training set, but fails to generalize satisfactorily for new data points. In this paper we propose a modified error measure which can reduce the tendency to over-fit and whose properties can be controlled by a single scalar parameter. The new error measure depends both on the function generated by the network and on its derivatives. A new learning algorithm is derived which can be used to minimize such error measures.
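As an illustrative sketch only (the exact form used in the paper is not reproduced here), a curvature-penalized error of the kind described above can be written, for network outputs y_k(x; w), training inputs x^n with targets t_k^n, and a single scalar smoothing parameter \lambda \geq 0, as

\tilde{E} = \frac{1}{2} \sum_{n} \sum_{k} \bigl( y_k(x^n; w) - t_k^n \bigr)^2 + \frac{\lambda}{2} \sum_{n} \sum_{k,i} \left( \left. \frac{\partial^2 y_k}{\partial x_i^2} \right|_{x^n} \right)^2 .

The first term is the usual sum-of-squares error over the training set; the second penalizes the curvature of the network mapping at the training points, with \lambda controlling the trade-off between fitting the data and smoothing the function. Minimizing such a measure by gradient descent requires derivatives of these second-derivative terms with respect to the weights, which is the role of the new learning algorithm referred to in the abstract.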
Original language: English
Title of host publication: Theory and Applications of Neural Networks
Subtitle of host publication: Proceedings of the First British Neural Network Society Meeting, London
Editors: J.G. Taylor, C.L.T. Mannion
Publisher: Springer
Pages: 139-148
Number of pages: 10
ISBN (Electronic): 978-1-4471-1833-6
ISBN (Print): 978-3-540-19650-1
DOIs
Publication status: Published - 1992

Publication series

Name: Perspectives in Neural Computing
Publisher: Springer London
ISSN (Print): 1431-6854
