
School of Science Academic Seminar: Stochastic alternating structure-adapted proximal gradient descent method with variance reduction for nonconvex nonsmooth optimization

Posted: 2023-06-05

Title: Stochastic alternating structure-adapted proximal gradient descent method with variance reduction for nonconvex nonsmooth optimization

Time: Monday, June 5, 2023, 15:00–17:00

Venue: 南一120


Speaker: Prof. Deren Han (韩德仁)

Abstract: We develop a stochastic alternating structure-adapted proximal (s-ASAP) gradient descent method for solving block optimization problems. By deploying state-of-the-art variance-reduced gradient estimators (rather than the full gradient) in stochastic optimization, the s-ASAP method is applicable to nonconvex consensus optimization problems whose objectives are the sum of a finite number of Lipschitz-continuous functions. The sublinear convergence rate of the s-ASAP method is established via proximal point theory. Furthermore, a linear convergence rate is attainable under mild conditions on the objectives, e.g., an error bound condition or the Kurdyka-Łojasiewicz (KL) property. Preliminary numerical experiments on applications in image processing demonstrate the compelling performance of the proposed method.
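To give a feel for the variance-reduction idea mentioned in the abstract, below is a minimal sketch of a generic SVRG-style proximal gradient iteration for a finite-sum problem min (1/2n)‖Ax − b‖² + λ‖x‖₁. This is an illustrative toy, not the s-ASAP algorithm itself; the problem data `A`, `b`, the regularizer, and all parameter names are assumptions made for the example.

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def svrg_prox_gd(A, b, lam, step, epochs=10, inner=None, seed=0):
    """SVRG-style proximal gradient sketch for
        min_x (1/2n) ||A x - b||^2 + lam * ||x||_1.
    Illustrative only; not the s-ASAP method from the talk.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    x = np.zeros(d)
    for _ in range(epochs):
        x_ref = x.copy()
        # Full gradient computed once per epoch at the snapshot point.
        full_grad = A.T @ (A @ x_ref - b) / n
        for _ in range(inner):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ x - b[i])          # stochastic gradient at x
            gi_ref = A[i] * (A[i] @ x_ref - b[i])  # same sample at snapshot
            v = gi - gi_ref + full_grad            # variance-reduced estimator
            x = prox_l1(x - step * v, step * lam)
    return x
```

The estimator `v` is unbiased and its variance shrinks as the iterates approach the snapshot, which is what allows stochastic methods of this family to retain the convergence rates of their full-gradient counterparts.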

