1. Title: Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods
2. Speaker: Xiantao Xiao, Associate Professor, Dalian University of Technology
3. Time: 16:30, Friday, January 7, 2022
4. Platform: Tencent Meeting (Meeting ID: 794-327-872)
5. Abstract: In this talk, we consider a mini-batch stochastic Bregman proximal gradient method and a mini-batch stochastic Bregman proximal extragradient method for stochastic convex composite optimization problems. A simplified and unified convergence analysis framework is proposed to obtain almost sure convergence properties and expected convergence rates of the mini-batch stochastic Bregman proximal gradient method and its variants. This framework can also be used to analyze the convergence of the mini-batch stochastic Bregman proximal extragradient method, which has seldom been discussed in the literature. We point out that neither the standard uniformly bounded variance assumption nor the usual Lipschitz gradient continuity assumption is required in the analysis.
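To make the setting concrete, the following is a minimal sketch of one mini-batch stochastic Bregman proximal gradient iteration, x⁺ = argmin ⟨g, u⟩ + (1/α)·D_h(u, x), on a toy least-squares objective over the probability simplex with the negative-entropy kernel h(x) = Σᵢ xᵢ log xᵢ (for which the step has a closed exponentiated-gradient form). All problem data, function names, and parameters here are illustrative assumptions, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: minimize f(x) = (1/n) sum_i 0.5*(a_i^T x - b_i)^2
# over the probability simplex, with b generated so the optimum x_star lies
# in the simplex (interpolation regime: mini-batch gradient noise vanishes
# at the solution, matching the talk's no-bounded-variance flavor).
n, d, batch = 1000, 5, 32
A_full = rng.normal(size=(n, d))
x_star = np.abs(rng.normal(size=d))
x_star /= x_star.sum()
b_full = A_full @ x_star  # noiseless targets

def full_objective(z):
    return 0.5 * np.mean((A_full @ z - b_full) ** 2)

def minibatch_grad(x):
    # Unbiased mini-batch estimate of the full gradient.
    idx = rng.integers(0, n, size=batch)
    A, b = A_full[idx], b_full[idx]
    return A.T @ (A @ x - b) / batch

def bregman_prox_step(x, g, alpha):
    # argmin_u <g, u> + (1/alpha) * D_h(u, x) over the simplex, where
    # h is the negative entropy; closed form: x+ ∝ x * exp(-alpha * g).
    y = x * np.exp(-alpha * g)
    return y / y.sum()

x0 = np.full(d, 1.0 / d)  # uniform starting point
x = x0
for _ in range(3000):
    x = bregman_prox_step(x, minibatch_grad(x), alpha=0.3)
```

The entropy kernel keeps every iterate strictly inside the simplex without a projection step, which is the practical appeal of the Bregman (mirror-descent) geometry over the Euclidean proximal map here.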
6. Speaker biography:
Xiantao Xiao is an associate professor and doctoral supervisor at Dalian University of Technology. His research focuses on algorithms for composite optimization and stochastic optimization. He has published over 40 papers in journals including SIAM Journal on Optimization, Mathematics of Operations Research, Journal of Scientific Computing, and Science China Mathematics, and has led two projects funded by the National Natural Science Foundation of China.