
dc.contributor.author: Burkhart, Michael
dc.date.accessioned: 2022-05-23T23:30:30Z
dc.date.available: 2022-05-23T23:30:30Z
dc.identifier.issn: 1862-4472
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/337413
dc.description.abstract: To minimize the average of a set of log-convex functions, the stochastic Newton method iteratively updates its estimate using subsampled versions of the full objective's gradient and Hessian. We contextualize this optimization problem as sequential Bayesian inference on a latent state-space model with a discriminatively specified observation process. Applying Bayesian filtering then yields a novel optimization algorithm that considers the entire history of gradients and Hessians when forming an update. We establish matrix-based conditions under which the effect of older observations diminishes over time, in a manner analogous to Polyak's heavy ball momentum. We illustrate various aspects of our approach with an example and review other relevant innovations for the stochastic Newton method.
dc.publisher: Springer
dc.rights: Publisher's own licence
dc.title: Discriminative Bayesian Filtering Lends Momentum to the Stochastic Newton Method for Minimizing Log-Convex Functions
dc.type: Article
dc.publisher.department: Department of Psychology
dc.date.updated: 2022-05-21T08:48:34Z
prism.publicationName: Optimization Letters
dc.identifier.doi: 10.17863/CAM.84825
dcterms.dateAccepted: 2022-05-19
rioxxterms.version: AM
dc.contributor.orcid: Burkhart, Michael [0000-0002-2772-5840]
rioxxterms.type: Journal Article/Review
cam.orpheus.counter: 10*
cam.depositDate: 2022-05-21
pubs.licence-identifier: apollo-deposit-licence-2-1
pubs.licence-display-name: Apollo Repository Deposit Licence Agreement
rioxxterms.freetoread.startdate: 2025-05-23
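
The abstract above refers to the baseline update that the paper builds on: at each iteration, solve a Newton system formed from a subsampled gradient and Hessian. Below is a minimal illustrative sketch of that baseline only, not the paper's filtering-based variant. The toy objective f_i(x) = exp(a_i . x), the batch size, step size, and ridge damping are all assumptions chosen for this demo.

import numpy as np

rng = np.random.default_rng(0)

# Toy problem (an assumption for this demo, not taken from the paper):
# minimize the average of the log-convex functions f_i(x) = exp(a_i @ x).
n, d = 200, 3
A = rng.normal(size=(n, d))

def grad_i(x, i):
    """Gradient of f_i at x: exp(a_i @ x) * a_i."""
    return np.exp(A[i] @ x) * A[i]

def hess_i(x, i):
    """Hessian of f_i at x: exp(a_i @ x) * outer(a_i, a_i)."""
    return np.exp(A[i] @ x) * np.outer(A[i], A[i])

def stochastic_newton(x, batch=32, steps=100, step_size=0.5):
    """Baseline subsampled Newton iteration as described in the abstract:
    each update solves a Newton system built from a random minibatch.
    (The paper's filtering-based algorithm instead pools the entire
    history of gradients and Hessians; that is not implemented here.)"""
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        g = np.mean([grad_i(x, i) for i in idx], axis=0)
        H = np.mean([hess_i(x, i) for i in idx], axis=0)
        H += 1e-8 * np.eye(d)  # ridge guards against a singular batch Hessian
        x = x - step_size * np.linalg.solve(H, g)
    return x

x_hat = stochastic_newton(np.ones(3))
print("final full-gradient norm:",
      np.linalg.norm(np.mean([grad_i(x_hat, i) for i in range(n)], axis=0)))

The damped step (step_size < 1) is a standard safeguard for Newton-type methods on exponential objectives; the paper's contribution, per the abstract, is to replace the per-batch curvature estimate with one inferred by Bayesian filtering over all past observations.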

