Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

MATHEMATICS OF OPERATIONS RESEARCH (2024)

Abstract
In nonsmooth stochastic optimization, we establish the nonconvergence of stochastic subgradient descent (SGD) to the critical points recently called active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold M along which the function f has a direction of second-order negative curvature, while off this manifold the norm of the Clarke subdifferential of f is lower-bounded. We require two conditions on f. The first is a Verdier stratification condition, a refinement of the popular Whitney stratification; it allows us to establish a strengthened version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second, termed the angle condition, allows us to control the distance of the iterates to M. When f is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer. Funding: The work of Sholom Schechtman was supported by “Région Ile-de-France”.
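For intuition, here is a minimal, illustrative sketch of the stochastic subgradient descent iteration z_{k+1} = z_k - a_k (v_k + noise), with v_k a Clarke subgradient, run on a toy weakly convex function f(x, y) = |x| - y^2/2 + y^4/4 whose origin is an active strict saddle with active manifold M = {x = 0}. The toy function, step-size schedule, and noise level are assumptions chosen for illustration and are not taken from the paper.

```python
import numpy as np

def subgrad_f(z, rng):
    """One Clarke subgradient of the toy weakly convex function
    f(x, y) = |x| - y**2/2 + y**4/4 (illustrative choice, not from the paper).
    The origin is an active strict saddle with active manifold M = {x = 0};
    the local minimizers are (0, 1) and (0, -1)."""
    x, y = z
    # Subgradient of |x|: sign(x) off M; at x = 0 pick any element of [-1, 1].
    gx = np.sign(x) if x != 0 else rng.uniform(-1.0, 1.0)
    gy = -y + y**3  # gradient of -y**2/2 + y**4/4
    return np.array([gx, gy])

def sgd(z0, n_iters=20000, seed=0):
    """Stochastic subgradient descent with a diminishing step size and
    zero-mean additive noise (both schedules are illustrative assumptions)."""
    rng = np.random.default_rng(seed)
    z = np.array(z0, dtype=float)
    for k in range(1, n_iters + 1):
        a_k = 1.0 / k**0.75                    # diminishing step size
        noise = 0.1 * rng.standard_normal(2)   # zero-mean perturbation
        z = z - a_k * (subgrad_f(z, rng) + noise)
    return z

if __name__ == "__main__":
    # Started at the saddle (0, 0), the noisy iterates drift away along the
    # unstable y-direction and settle near a local minimizer (0, +/-1),
    # while x stays close to the active manifold {x = 0}.
    print(sgd([0.0, 0.0]))
```

In this sketch, the noise prevents the iterates from stalling at the saddle: along M the restriction of f has negative curvature, so any perturbation in y is amplified, while the lower-bounded subgradient of |x| keeps the x-coordinate pinned near M, mirroring the mechanism described in the abstract.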
Keywords
stochastic
