
Neural ODE Processes

Accepted version
Peer-reviewed

Type

Article

Authors

Norcliffe, Alexander 
Bodnar, Cristian 
Day, Ben 
Moss, Jacob 
Liò, Pietro 

Abstract

Neural Ordinary Differential Equations (NODEs) use a neural network to model the instantaneous rate of change in the state of a system. However, despite their apparent suitability for dynamics-governed time-series, NODEs present a few disadvantages. First, they are unable to adapt to incoming data points, a fundamental requirement for real-time applications imposed by the natural direction of time. Second, time series are often composed of a sparse set of measurements that could be explained by many possible underlying dynamics. NODEs do not capture this uncertainty. In contrast, Neural Processes (NPs) are a family of models providing uncertainty estimation and fast data adaptation but lack an explicit treatment of the flow of time. To address these problems, we introduce Neural ODE Processes (NDPs), a new class of stochastic processes determined by a distribution over Neural ODEs. By maintaining an adaptive data-dependent distribution over the underlying ODE, we show that our model can successfully capture the dynamics of low-dimensional systems from just a few data points. At the same time, we demonstrate that NDPs scale up to challenging high-dimensional time-series with unknown latent dynamics such as rotating MNIST digits.
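To make the first sentence of the abstract concrete, here is a minimal sketch of the core Neural ODE idea: a neural network f models the instantaneous rate of change dh/dt of the state, and the state trajectory is obtained by numerically integrating that network. This is an illustration only, not the authors' implementation; the tiny random MLP, the 2-dimensional state, and the fixed-step Euler solver are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny random MLP f(h, t) -> dh/dt over a 2-dimensional state.
# Input is the state (2 dims) concatenated with the scalar time (1 dim).
W1 = rng.normal(scale=0.5, size=(3, 16))
W2 = rng.normal(scale=0.5, size=(16, 2))

def f(h, t):
    """Neural network giving the instantaneous rate of change of the state."""
    x = np.concatenate([h, [t]])
    return np.tanh(x @ W1) @ W2

def odeint_euler(h0, t_grid):
    """Integrate dh/dt = f(h, t) from h0 with fixed-step Euler over t_grid."""
    h = np.array(h0, dtype=float)
    trajectory = [h.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        h = h + (t1 - t0) * f(h, t0)   # one Euler step
        trajectory.append(h.copy())
    return np.stack(trajectory)

t_grid = np.linspace(0.0, 1.0, 21)
trajectory = odeint_euler([1.0, 0.0], t_grid)
print(trajectory.shape)  # (21, 2): one state per time point
```

A Neural ODE Process, as described in the abstract, would additionally maintain a data-dependent distribution over the parameters of f (here W1 and W2), so that sparse observations induce a distribution over plausible trajectories rather than a single one.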

Keywords

cs.LG

Journal Title

CoRR

Conference Name

The Ninth International Conference on Learning Representations

Rights

All rights reserved