Amortised MAP Inference for Image Super-Resolution

Published version


Conference Object


Huszár, Ferenc
Sønderby, Casper Kaae 
Caballero, Jose
Theis, Lucas 
Shi, Wenzhe 


Image super-resolution (SR) is an underdetermined inverse problem, where a large number of plausible high resolution images can explain the same downsampled image. Most current single image SR methods use empirical risk minimisation, often with a pixel-wise mean squared error (MSE) loss. However, the outputs from such methods tend to be blurry, over-smoothed and generally appear implausible. A more desirable approach would employ Maximum a Posteriori (MAP) inference, preferring solutions that always have a high probability under the image prior, and thus appear more plausible. Direct MAP estimation for SR is non-trivial, as it requires us to build a model for the image prior from samples. Here we introduce new methods for amortised MAP inference whereby we calculate the MAP estimate directly using a convolutional neural network. We first introduce a novel neural network architecture that performs a projection to the affine subspace of valid SR solutions, ensuring that the high resolution output of the network is always consistent with the low resolution input. Using this architecture, the amortised MAP inference problem reduces to minimising the cross-entropy between two distributions, similar to training generative models. We propose three methods to solve this optimisation problem: (1) Generative Adversarial Networks (GAN), (2) denoiser-guided SR, which backpropagates gradient estimates from denoising to train the network, and (3) a baseline method using a maximum-likelihood-trained image prior. Our experiments show that the GAN-based approach performs best on real image data. Lastly, we establish a connection between GANs and amortised variational inference as in e.g. variational autoencoders.
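The affine-subspace projection described in the abstract can be illustrated with a minimal NumPy sketch. Assuming the downsampling operator A is s×s block averaging (one common choice; the function names and the specific operator here are illustrative, not the authors' code), A Aᵀ = (1/s²)I, so projecting an arbitrary network output onto {y : A y = x} amounts to adding a nearest-neighbour upsampling of the residual x − A y:

```python
import numpy as np

def avg_pool(y, s):
    """Downsampling operator A: average over s x s blocks."""
    h, w = y.shape
    return y.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def affine_project(y, x, s):
    """Project y onto the affine subspace {y' : A y' = x}.

    For block averaging, A A^T = (1/s^2) I, so the pseudo-inverse
    A^+ = s^2 A^T, which replicates each low-res residual pixel
    over its s x s block (nearest-neighbour upsampling).
    """
    residual = x - avg_pool(y, s)                 # how far A y is from x
    return y + np.kron(residual, np.ones((s, s))) # correct each block

rng = np.random.default_rng(0)
y = rng.standard_normal((8, 8))   # arbitrary high-res network output
x = rng.standard_normal((4, 4))   # low-res input to be matched
y_proj = affine_project(y, x, s=2)
assert np.allclose(avg_pool(y_proj, 2), x)  # downsampling consistency holds
```

Because the projection is exact, any image the network emits after this layer downsamples back to the input, so training can focus entirely on choosing among the valid high-resolution explanations.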



Conference Name

International Conference on Learning Representations, ICLR 2017



All rights reserved