
An optimization approach to relate neural circuit architecture, loss landscapes and learning performance in static and dynamic tasks


Type

Thesis

Authors

Perez Rotondo, Adriana 

Abstract

Learning is challenging for large and complex neural circuits. There is a fundamental difficulty in determining how individual neurons or synapses affect the overall behavior of a circuit, known as the credit assignment problem. Rather than looking at single neurons or synapses, one must therefore step back and consider the overall circuit architecture, matching the circuit's structural patterns to its behavior.

One pattern of notable importance observed throughout the brain is dimensionality expansion, whereby a small population of neurons diverges onto a much larger group of neurons through synapses that convey the same information. As large neural circuits are energetically expensive, what is the purpose of these seemingly redundant synapses? Here, we show that this expansion architecture affects a circuit's ability to learn both static and dynamic tasks.

For static tasks, we show that adding redundant synapses to a neural circuit can increase learning accuracy. We evaluate the impact of synaptic changes on learning performance, quantify the inherent challenges of learning with biologically plausible learning rules in large neural circuits, and establish the relationship between learning difficulty and circuit architecture. We link the geometry of the loss landscape to the difficulty of a task and demonstrate how network expansions modify the loss landscape.
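As a minimal illustration of this idea (a toy sketch, not the models analysed in the thesis), the Python snippet below duplicates each synapse of a linear readout `n_copies` times and trains it by gradient descent. Because the readout is the sum of the copies, the redundant synapses add no information; they reshape the loss landscape (each minimum becomes a valley of equivalent solutions) and rescale the gradient seen by the effective weight. All dimensions, learning rates, and the `train` helper are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 10))           # inputs for a static (fixed-target) task
w_true = rng.normal(size=10)
y = x @ w_true                           # linear teacher output

def train(n_copies, lr=0.01, steps=200):
    """Gradient descent with each synapse duplicated n_copies times.

    The readout uses w_eff = sum of the copies, so the copies are
    redundant: they convey the same information but change the shape
    of the loss landscape in which learning takes place.
    """
    W = np.zeros((10, n_copies))         # n_copies redundant copies per synapse
    for _ in range(steps):
        w_eff = W.sum(axis=1)            # effective weight seen by the circuit
        err = x @ w_eff - y
        grad_eff = x.T @ err / len(y)    # gradient w.r.t. the effective weight
        W -= lr * grad_eff[:, None]      # identical update to every copy
    return 0.5 * np.mean((x @ W.sum(axis=1) - y) ** 2)

for n in (1, 2, 4):
    print(f"{n} copies -> final loss {train(n):.2e}")
```

In this toy landscape the expansion acts like an increased effective learning rate, so a fixed number of gradient steps reaches a lower loss; the thesis's loss-landscape analysis is of course far more general than this one-line mechanism.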

For dynamic tasks, we consider the cerebellum, which is involved in motor control and has a unique architecture: cerebellar mossy fibre inputs undergo a massive expansion into granule cells. Classical codon theory and its more recent extensions argue that this architecture facilitates learning via pattern separation; however, the theory predicts sparse granule cell layer activity, whereas recent physiological data indicate that activity is denser than previously thought, underscoring a gap between cerebellar theory and data. There is also a conceptual gap between static pattern separation and the cerebellum's critical role in dynamic tasks such as motor learning. We aim to close both gaps through mathematical analysis and simulations of cerebellar learning. We identify specific difficulties inherent to online learning of dynamic tasks, find that input expansions directly mitigate these difficulties, and show that this benefit is maximized when granule cell activity is dense.
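The toy sketch below (again illustrative assumptions, not the cerebellar model developed in the thesis) contrasts online least-mean-squares tracking of a nonlinear, time-varying target when the readout sees either the raw "mossy fibre" inputs or a dense granule-cell-like expansion, implemented here as a fixed random projection with rectification. The population sizes, drifting input process, thresholds, and target signal are all invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
n_mf, n_gc, T = 4, 200, 5000                        # mossy fibres, granule cells, steps
J = rng.normal(size=(n_gc, n_mf)) / np.sqrt(n_mf)   # fixed, random expansion weights
b = 0.5 * rng.normal(size=n_gc)                     # small random thresholds

def granule(u):
    """Granule-cell-like code: random projection plus rectification.
    With small random thresholds roughly half the units are active (a dense code)."""
    return np.maximum(J @ u + b, 0.0)

u = np.zeros(n_mf)
w_mf = np.zeros(n_mf)                               # readout trained on raw inputs
w_gc = np.zeros(n_gc)                               # readout trained on the expansion
lr, err_mf, err_gc = 0.2, [], []
for t in range(T):
    u = 0.9 * u + 0.3 * rng.normal(size=n_mf)       # slowly drifting input trajectory
    y = np.sin(3.0 * u[0]) + u[1] * u[2]            # nonlinear target to track online
    g = granule(u)
    e_mf = w_mf @ u - y                             # instantaneous tracking errors
    e_gc = w_gc @ g - y
    w_mf -= lr * e_mf * u / max(u @ u, 1e-8)        # normalized LMS updates
    w_gc -= lr * e_gc * g / max(g @ g, 1e-8)
    err_mf.append(e_mf ** 2)
    err_gc.append(e_gc ** 2)

print("mean squared tracking error, last 1000 steps:")
print(f"  raw inputs      : {np.mean(err_mf[-1000:]):.3f}")
print(f"  dense expansion : {np.mean(err_gc[-1000:]):.3f}")
```

In runs of this kind the expanded readout typically tracks the nonlinear target more closely than the raw linear readout, echoing, in a much simpler setting, the claim that dense expansions aid online learning of dynamic tasks.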

Overall, this study illuminates how neural circuit architecture determines a circuit's ability to learn a task. Our analysis uncovers fundamental relationships between network architecture, learning performance, and the geometry of loss landscapes, independent of specific learning rules. The findings suggest that seemingly redundant synapses in neural circuits may have a critical function in facilitating precise and fast motor learning.

Date

2023-04-13

Advisors

O'Leary, Timothy

Keywords

cerebellum, computational neuroscience, learning, loss landscapes, motor control, neural circuits, online learning, optimization

Qualification

Doctor of Philosophy (PhD)

Awarding Institution

University of Cambridge

Sponsorship

European Research Council, ERC grant No. 716643 (FLEXNEURO)