
Faculty Announcements

Academic Lecture by Prof. Xu Dongpo, June 1, 2024

Source: 新葡萄8883官网AMG | Posted: 2024-05-28

新葡萄8883官网AMG 2024 Mathematical Sciences Lecture Series, No. 19

Speaker: Prof. Xu Dongpo

Host: Prof. Zhang Naimin

Time: June 1, 2024, 9:00-10:00 a.m.

Venue: 新葡萄8883官网AMG, Room 3B205

Title: Stochastic momentum methods for non-convex learning without bounded assumptions

Abstract: Stochastic momentum methods are widely used to solve stochastic optimization problems in machine learning. However, most existing theoretical analyses rely on either bounded assumptions or strong stepsize conditions. In this talk, we focus on a class of non-convex objective functions satisfying the Polyak–Łojasiewicz (PL) condition and present a unified convergence rate analysis for stochastic momentum methods without any bounded assumptions, covering stochastic heavy ball (SHB) and stochastic Nesterov accelerated gradient (SNAG). Our analysis achieves the more challenging last-iterate convergence rate of function values under the relaxed growth (RG) condition, which is a weaker assumption than those used in related work. Specifically, we attain a sub-linear rate for stochastic momentum methods with diminishing stepsizes, and a linear convergence rate for constant stepsizes if the strong growth (SG) condition holds. We also examine the iteration complexity for obtaining an ϵ-accurate solution at the last iterate.
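To make the setting concrete, here is a minimal sketch of the SHB update on a stochastic least-squares problem, whose objective satisfies the PL condition. The momentum value and the diminishing stepsize schedule are illustrative choices for this demo, not the ones analyzed in the talk.

```python
import numpy as np

# Stochastic heavy ball (SHB) on f(x) = (1/2n)||Ax - b||^2,
# an objective satisfying the PL condition. Momentum and stepsize
# schedule below are illustrative, not those from the talk.

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)        # consistent system: minimum is 0

def f(x):
    r = A @ x - b
    return 0.5 * np.mean(r ** 2)

beta = 0.5                             # momentum parameter
x = np.zeros(d)
x_prev = x.copy()
for t in range(1, 5001):
    i = rng.integers(n)                # sample one data point
    g = (A[i] @ x - b[i]) * A[i]       # stochastic gradient of f at x
    alpha = 0.2 / t                    # diminishing stepsize
    # SHB update: gradient step plus momentum (heavy-ball) term
    x, x_prev = x - alpha * g + beta * (x - x_prev), x

print(f"f(0) = {f(np.zeros(d)):.3e}, f(x_T) = {f(x):.3e}")
```

With diminishing stepsizes the last iterate steadily reduces the objective; replacing `alpha` with a constant stepsize would correspond to the linear-rate regime discussed under the strong growth condition.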

Biography:

Xu Dongpo is a professor and doctoral supervisor at the School of Mathematics and Statistics, Northeast Normal University. He received his Ph.D. in Computational Mathematics from Dalian University of Technology in 2009. His research focuses on optimization theory for deep learning and its applications. As first author, he has published multiple SCI-indexed papers in journals including IEEE Transactions on Neural Networks and Learning Systems, IEEE Transactions on Signal Processing, Neural Networks, Neural Computation, and Neurocomputing. In 2022 he received the Second Prize of the Jilin Province Natural Science Award (sole contributor), and in 2017 the Second Prize of the Jilin Province Natural Science Academic Achievement Award (first contributor). He currently leads a General Program grant of the National Natural Science Foundation of China, participates as a core member in two National Key R&D Program projects, and serves as an associate editor of Frontiers in Artificial Intelligence and Frontiers in Big Data.

