Perspective: New insights from loss function landscapes of neural networks
Publication Date
2020
Journal Title
Machine Learning: Science and Technology
ISSN
2632-2153
Publisher
IOP Publishing
Volume
1
Issue
2
Language
en
Type
Article
This Version
VoR
Citation
Chitturi, S., Verpoort, P., Lee, A., & Wales, D. (2020). Perspective: New insights from loss function landscapes of neural networks. Machine Learning: Science and Technology, 1(2). https://doi.org/10.1088/2632-2153/ab7aef
Abstract
We investigate the structure of the loss function landscape for neural networks subject to dataset mislabelling, increased training set diversity, and reduced node connectivity, using various techniques developed for energy landscape exploration. The benchmarking models are classification problems for atomic geometry optimisation and hand-written digit prediction. We consider the effect of varying the size of the atomic configuration space used to generate initial geometries and find that the number of stationary points increases rapidly with the size of the training configuration space. We introduce a measure of node locality to limit network connectivity and perturb permutational weight symmetry, and examine how this parameter affects the resulting landscapes. We find that highly reduced systems have low capacity and exhibit landscapes with very few minima. On the other hand, small amounts of reduced connectivity can enhance network expressibility and can yield more complex landscapes. Investigating the effect of deliberate classification errors in the training data, we find that the variance in testing AUC, computed over a sample of minima, grows significantly with the training error, providing new insight into the role of the variance-bias trade-off when training under noise. Finally, we illustrate how the number of local minima for networks with two and three hidden layers, but a comparable number of variable edge weights, increases significantly with the number of layers, and as the number of training data points decreases. This work helps shed further light on neural network loss landscapes and provides guidance for future work on neural network training and optimisation.
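As an illustrative aside, the spread in test AUC over a sample of minima described in the abstract can be approximated by training the same architecture from several random initialisations and computing the variance of the resulting test scores. The sketch below is a minimal Python example under that assumption; the `train_fn` helper, its `random_state` argument, and the scikit-learn-style `predict_proba` interface are placeholders, not the energy-landscape exploration techniques used in the paper.

```python
# Illustrative sketch (not the paper's methodology): estimate the mean and
# variance of test AUC over a sample of independently trained networks, each
# treated as a different local minimum of the loss landscape.
import numpy as np
from sklearn.metrics import roc_auc_score


def auc_spread_over_minima(train_fn, X_train, y_train, X_test, y_test,
                           n_minima=20, seed=0):
    """Train from n_minima random initialisations; return (mean, variance) of test AUC.

    train_fn is a hypothetical helper: it should fit a binary classifier on
    (X_train, y_train) using the given random_state and return a model that
    exposes a scikit-learn-style predict_proba method.
    """
    rng = np.random.default_rng(seed)
    aucs = []
    for _ in range(n_minima):
        model = train_fn(X_train, y_train,
                         random_state=int(rng.integers(1_000_000)))
        scores = model.predict_proba(X_test)[:, 1]  # probability of the positive class
        aucs.append(roc_auc_score(y_test, scores))
    aucs = np.asarray(aucs)
    return aucs.mean(), aucs.var()
```

Repeating this calculation at increasing levels of deliberate training-set mislabelling would expose the growth in AUC variance that the abstract reports.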
Keywords
Perspective, machine learning landscape, mislabelling, node locality
Sponsorship
EPSRC
Funder references
Engineering and Physical Sciences Research Council (EP/N035003/1)
Identifiers
mlstab7aef, ab7aef, mlst-100027.r1
External DOI: https://doi.org/10.1088/2632-2153/ab7aef
This record's URL: https://www.repository.cam.ac.uk/handle/1810/333062
Rights
Licence:
http://creativecommons.org/licenses/by/4.0/