Towards a Theory of Non-Log-Concave Sampling: First-Order Stationarity Guarantees for Langevin Monte Carlo
CONFERENCE ON LEARNING THEORY, VOL. 178 (2022)
Abstract
For the task of sampling from a density $\pi \propto \exp(-V)$ on $\mathbb{R}^d$, where $V$ is possibly non-convex but $L$-gradient Lipschitz, we prove that averaged Langevin Monte Carlo outputs a sample with $\varepsilon$-relative Fisher information after $O(L^2 d^2/\varepsilon^2)$ iterations. This is the sampling analogue of complexity bounds for finding an $\varepsilon$-approximate first-order stationary point in non-convex optimization and therefore constitutes a first step towards the general theory of non-log-concave sampling. We discuss numerous extensions and applications of our result; in particular, it yields a new state-of-the-art guarantee for sampling from distributions which satisfy a Poincaré inequality.
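As a concrete illustration of the algorithm the abstract analyzes, the following Python snippet is a minimal sketch of averaged Langevin Monte Carlo, not the authors' implementation: the function grad_V, the step size, the iteration count, and the double-well example are hypothetical choices, and the "averaged" output is realized here by returning an iterate chosen uniformly at random, which is one standard way of drawing from the averaged law of the iterates.

```python
import numpy as np

def averaged_lmc(grad_V, x0, step, n_iters, rng=None):
    """Averaged Langevin Monte Carlo (illustrative sketch).

    Runs the update x_{k+1} = x_k - step * grad_V(x_k) + sqrt(2 * step) * xi_k,
    with xi_k ~ N(0, I_d), and returns one iterate chosen uniformly at random,
    i.e. a draw from the averaged law of the iterates.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    iterates = []
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_V(x) + np.sqrt(2.0 * step) * noise
        iterates.append(x.copy())
    return iterates[rng.integers(n_iters)]

# Hypothetical non-convex target: double-well potential V(x) = (|x|^2 - 1)^2 / 4,
# whose gradient is (|x|^2 - 1) * x.
grad_V = lambda x: (np.dot(x, x) - 1.0) * x
sample = averaged_lmc(grad_V, x0=np.zeros(2), step=1e-2, n_iters=5000)
```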
Keywords
Fisher information, Langevin Monte Carlo, non-log-concave sampling, Poincaré inequality