Stochastic Primal–Dual Hybrid Gradient Algorithm with Adaptive Step Sizes
Published version
Peer-reviewed
Abstract
In this work, we propose a new primal–dual algorithm with adaptive step sizes. The stochastic primal–dual hybrid gradient (SPDHG) algorithm with constant step sizes has become widely applied in large-scale convex optimization across many scientific fields due to its scalability. While the product of the primal and dual step sizes is subject to an upper bound in order to ensure convergence, the selection of the ratio of the step sizes is critical in applications. To date, there is no systematic and successful way of selecting the primal and dual step sizes for SPDHG. In this work, we propose a general class of adaptive SPDHG (A-SPDHG) algorithms and prove their convergence under weak assumptions. We also propose concrete parameter-update strategies which satisfy the assumptions of our theory and thereby lead to convergent algorithms. Numerical examples on computed tomography demonstrate the effectiveness of the proposed schemes.
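The step-size trade-off described in the abstract can be illustrated with a small sketch. The snippet below applies the deterministic PDHG template (the single-subset special case of SPDHG) to a toy nonnegative least-squares problem, keeping the product of the primal and dual step sizes fixed below 1/‖A‖² while adjusting their ratio with a simple residual-balancing rule. This adaptive rule is a hypothetical illustration with vanishing adaptation strength, not the A-SPDHG update schedule proposed in the paper.

```python
import numpy as np

def adaptive_pdhg(A, b, n_iter=2000, tau=0.95, sigma=0.95):
    """Illustrative PDHG for min_x 0.5*||Ax - b||^2 subject to x >= 0.

    The product tau*sigma stays fixed below 1/||A||^2 (the usual
    convergence condition); only the ratio tau/sigma is adapted,
    via a hypothetical residual-balancing rule.
    """
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(m)
    xbar = x.copy()
    damp = 1.0  # adaptation strength, driven to zero over iterations
    for _ in range(n_iter):
        x_old, y_old = x, y
        # Dual ascent step: prox of f*(y) = 0.5*||y||^2 + <b, y>.
        v = y + sigma * (A @ xbar)
        y = (v - sigma * b) / (1.0 + sigma)
        # Primal descent step: projection onto the nonnegative orthant.
        x = np.maximum(x_old - tau * (A.T @ y), 0.0)
        xbar = 2.0 * x - x_old  # extrapolation with theta = 1
        # Residual balancing: rescale the ratio, keep the product fixed.
        p = np.linalg.norm((x_old - x) / tau - A.T @ (y_old - y))
        d = np.linalg.norm((y_old - y) / sigma - A @ (x_old - x))
        if p > 2.0 * d:
            scale = 1.0 + 0.5 * damp      # primal residual dominates
        elif d > 2.0 * p:
            scale = 1.0 / (1.0 + 0.5 * damp)  # dual residual dominates
        else:
            scale = 1.0
        tau *= scale
        sigma /= scale
        damp *= 0.95  # vanishing adaptation, so the step sizes settle
    return x
```

On a well-conditioned toy instance such as `A = np.eye(2)`, `b = np.array([1.0, -1.0])`, the iterate converges to the nonnegative least-squares solution `[1, 0]`; the point of the sketch is that the ratio can be tuned online while the product, and hence the convergence guarantee, is left untouched.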
Funder: Philip Leverhulme Prize
Funder: Royal Society Wolfson Fellowship
Funder: Alan Turing Institute; doi: http://dx.doi.org/10.13039/100012338
Journal ISSN
1573-7683
Sponsorship
Leverhulme Trust (ECF-2019-478)
Wellcome Trust (215733/Z/19/Z)
Horizon 2020 Framework Programme (777826 NoMADS)