Langevin Dynamics with Continuous Tempering for High-dimensional Non-convex Optimization.

Accepted version
Peer-reviewed

Type

Conference Object

Change log

Authors

Ye, Nanyang 
Zhu, Zhanxing 
Mantiuk, Rafal K 

Abstract

Minimizing non-convex and high-dimensional objective functions is challenging, especially when training modern deep neural networks. In this paper, a novel approach is proposed that divides the training process into two consecutive phases to obtain better generalization performance: Bayesian sampling and stochastic optimization. The first phase explores the energy landscape by applying continuous tempering to Langevin dynamics, with the temperature adjusted automatically according to the designed "temperature dynamics"; the second phase then fine-tunes the parameters found in the first. These strategies can overcome the challenge of early trapping into bad local minima and have achieved remarkable improvements in various types of neural networks, as shown in our theoretical analysis and empirical experiments.
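
To make the two-phase idea in the abstract concrete, the following is a minimal sketch, not the authors' exact algorithm: a Langevin-style exploration phase whose injected noise is scaled by an annealed temperature, followed by a noise-free gradient fine-tuning phase. The toy objective, step size, and the geometric temperature schedule are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def loss(x):
    # Toy non-convex objective with many local minima (illustrative only).
    return np.sum(x**2) + 2.0 * np.sum(np.sin(3.0 * x))

def grad(x):
    # Gradient of the toy objective above.
    return 2.0 * x + 6.0 * np.cos(3.0 * x)

x = rng.normal(size=10) * 3.0   # random starting point
eta = 1e-2                      # step size (assumed)

# Phase 1: Langevin-style sampling with an annealed temperature T.
# Update: x <- x - eta * grad(x) + sqrt(2 * eta * T) * N(0, I)
T, decay, n_sample_steps = 1.0, 0.995, 2000
for t in range(n_sample_steps):
    x = x - eta * grad(x) + np.sqrt(2.0 * eta * T) * rng.normal(size=x.shape)
    T *= decay                  # simple geometric annealing (assumed schedule)

# Phase 2: stochastic optimization (plain gradient descent here) to fine-tune.
for t in range(2000):
    x = x - eta * grad(x)

print("final loss:", loss(x))

In the paper's setting the gradients would be stochastic minibatch gradients of a neural-network loss and the temperature would follow the designed "temperature dynamics" rather than this fixed decay; the sketch only illustrates the explore-then-optimize structure.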

Description

Keywords

Journal Title

CoRR

Conference Name

Journal ISSN

Volume Title

abs/1703.04379

Publisher