Fundamental bounds on learning performance in neural circuits.

Published version
Peer-reviewed

Type

Article

Authors

Raman, Dhruva Venkita 
Rotondo, Adriana Perez 

Abstract

How does the size of a neural circuit influence its learning performance? Larger brains tend to be found in species with higher cognitive function and learning ability. Intuitively, we expect the learning capacity of a neural circuit to grow with the number of neurons and synapses. We show how adding apparently redundant neurons and connections to a network can make a task more learnable. Consequently, large neural circuits can either devote connectivity to generating complex behaviors or exploit this connectivity to achieve faster and more precise learning of simpler behaviors. However, we show that in a biologically relevant setting where synapses introduce an unavoidable amount of noise, there is an optimal size of network for a given task. Above the optimal network size, the addition of neurons and synaptic connections starts to impede learning performance. This suggests that the size of brain circuits may be constrained by the need to learn efficiently with unreliable synapses and provides a hypothesis for why some neurological learning deficits are associated with hyperconnectivity. Our analysis is independent of specific learning rules and uncovers fundamental relationships between learning rate, task performance, network size, and intrinsic noise in neural circuits.
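
The tradeoff stated in the abstract can be reproduced in a minimal toy model, which is not the formulation analysed in the paper: a set of redundant synapses sharing one error signal learns a scalar target faster as their number grows, but fixed per-synapse update noise eventually dominates, so an intermediate network size minimises the final error. A sketch in Python follows, with every name and parameter value (eta, sigma, step and trial counts) chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def loss_after_learning(n_synapses, eta=0.002, sigma=0.005,
                        n_steps=100, n_trials=200):
    """Mean squared error after n_steps of noisy gradient descent.

    Toy model (illustrative only): a scalar target a is produced by
    the summed weight of n_synapses redundant synapses. Each synapse
    follows the shared error gradient with a fixed per-synapse
    learning rate (eta) and receives i.i.d. Gaussian update noise
    of magnitude sigma at every step.
    """
    a = 1.0
    losses = []
    for _ in range(n_trials):
        w = np.zeros(n_synapses)
        for _ in range(n_steps):
            err = w.sum() - a  # shared error signal for all synapses
            # gradient step plus intrinsic synaptic noise
            w += -eta * err + sigma * rng.normal(size=n_synapses)
        losses.append((w.sum() - a) ** 2)
    return float(np.mean(losses))

for n in (1, 5, 20, 50, 100, 300, 600, 900):
    print(f"{n:4d} synapses -> final loss {loss_after_learning(n):.4f}")
```

Under these assumed parameters the printed loss first falls as synapses are added (the summed weight converges at a rate that grows with network size) and then rises again as accumulated synaptic noise dominates, so an intermediate size performs best, mirroring the abstract's claim of an optimal network size for a given task.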

Description

Keywords

artificial intelligence, learning, neural network, optimization, synaptic plasticity

Journal Title

Proceedings of the National Academy of Sciences of the United States of America

Conference Name

Journal ISSN

1091-6490

Volume

116

Publisher

National Academy of Sciences

Sponsorship

European Research Council (716643)

This work is supported by European Research Council Grant StG2016-FLEXNEURO (716643).