Stochastic Primal–Dual Hybrid Gradient Algorithm with Adaptive Step Sizes

Published version
Peer-reviewed


Authors

Chambolle, A 
Delplancke, C 
Ehrhardt, MJ 
Schönlieb, CB 
Tang, J 

Abstract

In this work, we propose a new primal–dual algorithm with adaptive step sizes. The stochastic primal–dual hybrid gradient (SPDHG) algorithm with constant step sizes has become widely applied in large-scale convex optimization across many scientific fields due to its scalability. While the product of the primal and dual step sizes is subject to an upper bound in order to ensure convergence, the selection of the ratio of the step sizes is critical in applications. Up to now, there has been no systematic and successful way of selecting the primal and dual step sizes for SPDHG. We propose a general class of adaptive SPDHG (A-SPDHG) algorithms and prove their convergence under weak assumptions. We also propose concrete parameter-updating strategies which satisfy the assumptions of our theory and thereby lead to convergent algorithms. Numerical examples on computed tomography demonstrate the effectiveness of the proposed schemes.
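
For illustration only, the following is a minimal Python/NumPy sketch of a generic SPDHG-style iteration on a toy regularized least-squares problem, showing where the primal step size tau, the dual step sizes sigma_i, and their ratio gamma enter the updates. The toy problem, the prox formulas, the 0.99 safety factor, and the fixed ratio gamma are assumptions made for this sketch; it is not the paper's A-SPDHG method or its adaptive update rule.

import numpy as np

# Hypothetical toy problem and parameters for illustration; not from the paper.
rng = np.random.default_rng(0)
n_blocks, n, lam = 4, 20, 0.1
A = [rng.standard_normal((10, n)) for _ in range(n_blocks)]   # forward-operator blocks A_i
b = [rng.standard_normal(10) for _ in range(n_blocks)]        # data blocks b_i

norms = [np.linalg.norm(Ai, 2) for Ai in A]   # spectral norms ||A_i||
p = np.full(n_blocks, 1.0 / n_blocks)         # uniform block-sampling probabilities
gamma = 1.0                                   # primal/dual step-size ratio; A-SPDHG adapts this, here it is fixed

def step_sizes(gamma):
    # Enforce the standard SPDHG bound sigma_i * tau * ||A_i||^2 < p_i for every block,
    # while keeping sigma_i / tau proportional to gamma.
    sigma = np.array([0.99 * np.sqrt(p[i] * gamma) / norms[i] for i in range(n_blocks)])
    tau = min(0.99 * np.sqrt(p[i] / gamma) / norms[i] for i in range(n_blocks))
    return sigma, tau

# Toy objective: min_x  sum_i 0.5*||A_i x - b_i||^2 + 0.5*lam*||x||^2
x = np.zeros(n)
y = [np.zeros(10) for _ in range(n_blocks)]
z = sum(A[i].T @ y[i] for i in range(n_blocks))
zbar = z.copy()

for k in range(500):
    sigma, tau = step_sizes(gamma)
    # Primal step: prox of tau * 0.5 * lam * ||x||^2
    x = (x - tau * zbar) / (1.0 + tau * lam)
    # Dual step on one randomly sampled block j
    j = rng.integers(n_blocks)
    v = y[j] + sigma[j] * (A[j] @ x)
    y_new = (v - sigma[j] * b[j]) / (1.0 + sigma[j])   # prox of sigma_j * f_j^*, with f_j(u) = 0.5*||u - b_j||^2
    delta = A[j].T @ (y_new - y[j])
    z = z + delta
    zbar = z + delta / p[j]                            # extrapolated dual aggregate
    y[j] = y_new
    # An adaptive scheme would update gamma (and hence sigma and tau) here;
    # this sketch simply keeps the ratio fixed.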

Description

Funder: Philip Leverhulme Prize
Funder: Royal Society Wolfson Fellowship
Funder: Alan Turing Institute; doi: http://dx.doi.org/10.13039/100012338

Keywords

4901 Applied Mathematics, 46 Information and Computing Sciences, 4903 Numerical and Computational Mathematics, 49 Mathematical Sciences, Bioengineering

Journal Title

Journal of Mathematical Imaging and Vision

Journal ISSN

0924-9907
1573-7683

Volume

66

Publisher

Springer Science and Business Media LLC
Sponsorship
Engineering and Physical Sciences Research Council (EP/S026045/1, EP/V029428/1)
Leverhulme Trust (ECF-2019-478)
Wellcome Trust (215733/Z/19/Z)
Horizon 2020 Framework Programme (777826 NoMADS)