Stochastic Primal-Dual Hybrid Gradient Algorithm with Adaptive Step Sizes.

Published version
Peer-reviewed

Abstract

In this work, we propose a new primal-dual algorithm with adaptive step sizes. The stochastic primal-dual hybrid gradient (SPDHG) algorithm with constant step sizes is widely applied in large-scale convex optimization across many scientific fields due to its scalability. While the product of the primal and dual step sizes is subject to an upper bound in order to ensure convergence, the selection of the ratio of the step sizes is critical in applications. Up to now there has been no systematic and reliable way of selecting the primal and dual step sizes for SPDHG. Here we propose a general class of adaptive SPDHG (A-SPDHG) algorithms and prove their convergence under weak assumptions. We also propose concrete parameter-updating strategies which satisfy the assumptions of our theory and thereby lead to convergent algorithms. Numerical examples on computed tomography demonstrate the effectiveness of the proposed schemes.
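To make the abstract's setting concrete, below is a minimal sketch of the constant-step-size SPDHG baseline it refers to, applied to a toy least-squares problem split row-wise into blocks. This is an illustration of the standard algorithm, not the paper's adaptive A-SPDHG scheme; the variable names, the block splitting, and the ratio parameter `rho` are illustrative choices. The step sizes are chosen so that `sigma * tau * ||A_i||^2 <= p_i` (uniform sampling, `p_i = 1/n`), the kind of product upper bound the abstract mentions, while `rho` is the free primal/dual ratio whose selection the paper addresses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: min_x 0.5 * ||A x - b||^2, with the rows of A split into n blocks,
# i.e. f_i(z) = 0.5 * ||z - b_i||^2 applied to A_i x, and g = 0.
m, d, n = 40, 10, 4
A = rng.standard_normal((m, d))
b = A @ rng.standard_normal(d)          # consistent system
blocks = np.array_split(np.arange(m), n)

# Constant step sizes. Convergence requires the product sigma * tau to satisfy
# sigma * tau * ||A_i||^2 <= p_i = 1/n; the ratio sigma/tau (here set via rho)
# is a free parameter -- its selection is what the paper is about.
Lmax = max(np.linalg.norm(A[idx], 2) for idx in blocks)  # largest block norm
gamma, rho = 0.99, 1.0
sigma = gamma * rho / Lmax
tau = gamma / (rho * n * Lmax)

x = np.zeros(d)          # primal iterate
y = np.zeros(m)          # dual iterate, one component per row of A
z = A.T @ y              # running aggregate A^T y
zbar = z.copy()          # extrapolated aggregate used in the primal step

for k in range(30000):
    # Primal step: prox of tau * g is the identity since g = 0.
    x = x - tau * zbar
    # Sample one block uniformly and update its dual component.
    i = rng.integers(n)
    idx = blocks[i]
    y_old = y[idx].copy()
    # prox_{sigma f_i^*}(v) = (v - sigma * b_i) / (1 + sigma)
    # for f_i(z) = 0.5 * ||z - b_i||^2.
    v = y[idx] + sigma * (A[idx] @ x)
    y[idx] = (v - sigma * b[idx]) / (1.0 + sigma)
    # Update the aggregate and extrapolate to correct for sampling (1/p_i = n).
    dz = A[idx].T @ (y[idx] - y_old)
    z = z + dz
    zbar = z + n * dz

# The residual shrinks toward zero as the iterates approach the least-squares solution.
residual = np.linalg.norm(A @ x - b)
```

In this parameterization the product `sigma * tau = gamma**2 / (n * Lmax**2)` stays fixed as `rho` varies, which is exactly why the ratio is the critical remaining degree of freedom.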

Description

Funder: Philip Leverhulme Prize


Funder: Royal Society Wolfson Fellowship


Funder: Alan Turing Institute; doi: http://dx.doi.org/10.13039/100012338

Journal Title

J Math Imaging Vis

Journal ISSN

0924-9907
1573-7683

Volume Title

66

Publisher

Springer Nature

Rights and licensing

Except where otherwise noted, this item's license is described as http://creativecommons.org/licenses/by/4.0/
Sponsorship
Engineering and Physical Sciences Research Council (EP/S026045/1, EP/V029428/1)
Leverhulme Trust (ECF-2019-478)
Wellcome Trust (215733/Z/19/Z)
Horizon 2020 Framework Programme (777826 NoMADS)