Single-frame Regularization for Temporally Stable CNNs
Publication Date
2019
Journal Title
2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019)
ISSN
1063-6919
Pages
11168-11177
Type
Conference Object
This Version
AM
Citation
Eilertsen, G., Mantiuk, R., & Unger, J. (2019). Single-frame Regularization for Temporally Stable CNNs. 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019), 11168-11177. https://doi.org/10.1109/CVPR.2019.01143
Abstract
Convolutional neural networks (CNNs) can model complicated non-linear
relations between images. However, they are notoriously sensitive to small
changes in the input. Most CNNs trained to describe image-to-image mappings
generate temporally unstable results when applied to video sequences, leading
to flickering artifacts and other inconsistencies over time. In order to use
CNNs for video material, previous methods have relied on estimating dense
frame-to-frame motion information (optical flow) in the training and/or the
inference phase, or on recurrent learning structures. We take a
different approach to the problem, posing temporal stability as a
regularization of the cost function. The regularization is formulated to
account for different types of motion that can occur between frames, so that
temporally stable CNNs can be trained without the need for video material or
expensive motion estimation. The training can be performed as a fine-tuning
operation, without architectural modifications of the CNN. Our evaluation shows
that the training strategy leads to large improvements in temporal smoothness.
Moreover, for small datasets the regularization can help in boosting the
generalization performance to a much larger extent than what is possible with
na\"ive augmentation strategies.
Sponsorship
European Commission Horizon 2020 (H2020) ERC (725253)
Identifiers
External DOI: https://doi.org/10.1109/CVPR.2019.01143
This record's URL: https://www.repository.cam.ac.uk/handle/1810/300518
Rights
All rights reserved