The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.
Publication Date
2021-10
Journal Title
PLoS Comput Biol
ISSN
1553-734X
Publisher
Public Library of Science (PLoS)
Volume
17
Issue
10
Language
en
Type
Article
This Version
VoR
Citation
Scholl, C., Rule, M. E., & Hennig, M. H. (2021). The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules. PLoS Comput Biol, 17(10). https://doi.org/10.1371/journal.pcbi.1009458
Description
Funder: Studienstiftung des Deutschen Volkes; funder-id: http://dx.doi.org/10.13039/501100004350
Funder: Bundesministerium für Bildung und Forschung; funder-id: http://dx.doi.org/10.13039/501100002347
Funder: Max-Planck-Gesellschaft; funder-id: http://dx.doi.org/10.13039/501100004189
Abstract
During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remain unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally relevant connections, (2) pruning by synaptic weight alone does not optimize network size, and (3) pruning based on a locally-available measure of importance derived from Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.
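The local pruning rule summarized in the abstract can be illustrated with a short sketch. The following Python/NumPy example is illustrative only and is not the authors' implementation: it approximates each synapse's importance by the variance, over input samples, of the product of pre- and post-synaptic activity (for a Boltzmann-machine weight, the diagonal Fisher information is the variance of this sufficient statistic), and then removes the least-important synapses. The network sizes, the toy activation model, the pruning fraction, and all variable names are assumptions made for illustration.

    # Minimal sketch (not the authors' implementation): activity-dependent
    # synaptic pruning using a locally available, Fisher-information-like
    # importance estimate computed from pre/post coactivity statistics.
    import numpy as np

    rng = np.random.default_rng(0)

    n_pre, n_post, n_samples = 50, 20, 1000
    W = rng.normal(scale=0.1, size=(n_pre, n_post))      # synaptic weights
    pre = rng.binomial(1, 0.2, size=(n_samples, n_pre))  # presynaptic activity

    # Postsynaptic activity: sigmoidal response to summed input (toy model).
    post = 1.0 / (1.0 + np.exp(-(pre @ W)))

    # Local importance proxy: variance over samples of the pre*post product.
    # Each synapse can estimate this from signals available at the synapse.
    coactivity = pre[:, :, None] * post[:, None, :]      # (samples, pre, post)
    importance = coactivity.var(axis=0)                  # (pre, post)

    # Prune the least-important synapses (bottom 50%, an arbitrary choice).
    threshold = np.quantile(importance, 0.5)
    mask = importance > threshold
    W_pruned = W * mask

    print(f"retained {mask.mean():.0%} of synapses")

In the paper this importance signal is used within a deep Boltzmann machine and compared against pruning by weight magnitude; the sketch above only shows the shape of the local computation: importance is estimated from pre/post correlations rather than from weight size alone.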
Keywords
Algorithms, Animals, Computational Biology, Humans, Information Theory, Models, Neurological, Nerve Net, Neural Networks, Computer, Neurons, Synapses
Sponsorship
Engineering and Physical Sciences Research Council (EP/L027208/1)
Identifiers
pcompbiol-d-20-02149
External DOI: https://doi.org/10.1371/journal.pcbi.1009458
This record's URL: https://www.repository.cam.ac.uk/handle/1810/330615
Rights
Licence:
http://creativecommons.org/licenses/by/4.0/