Academic Talk — Prof. Andreas Themelis (Kyushu University, Japan)
Posted: 2023-08-12 09:40


Speaker: Prof. Andreas Themelis

Host: Prof. Xianfu Wang (王宪福)

Venue: Room 326, Huixian Building


About the Speaker

Andreas Themelis is a professor at Kyushu University, Japan. He holds a joint doctorate in Computer Science and Decision Systems from IMT School for Advanced Studies Lucca (Italy) and in Electrical Engineering from KU Leuven (Belgium). He has published 12 papers in leading optimization journals, including SIAM J. Optim., Math. Program., and IEEE Trans. Automat. Control.


Time: August 13, 2023, 9:00

Tencent Meeting ID: 445 811 226

Title: Globalized Newton-type algorithms for nonsmooth nonconvex structured optimization

Abstract:

"Splitting" algorithms, such as proximal gradient and ADMM, have proven advantageous for tackling complex problems by breaking them into manageable subtasks. However, due to their first-order nature, they often encounter challenges with ill-conditioning, leading to frustratingly slow convergence rates that hinder their practicality. Several remedies have been proposed to address this issue, but most are either problem-specific, offer only local convergence guarantees, or compromise the algorithms' simplicity. This talk offers a tutorial on the employment of "proximal envelopes" to provide a solution that does not suffer from any of the aforementioned limitations. The tutorial focuses on the proximal gradient algorithm, but the methodology easily extends to any other splitting method that possesses an "envelope" function, such as the ADMM and the Douglas-Rachford splitting.
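To make the splitting idea concrete, the following is a minimal sketch (the LASSO problem data and all parameter values are made up for illustration, and are not from the talk) of the proximal gradient method, together with an evaluation of its forward-backward envelope, the kind of "proximal envelope" the tutorial builds on:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Hypothetical problem data: minimize f(x) + g(x) with
# f(x) = 0.5 * ||A x - b||^2 (smooth) and g(x) = lam * ||x||_1 (prox-friendly).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1

L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad f (spectral norm squared)
gamma = 1.0 / L                 # stepsize guaranteeing descent

f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
g = lambda x: lam * np.sum(np.abs(x))

def fbe(x):
    """Forward-backward envelope of f + g: a smooth merit function that
    agrees with f + g at fixed points and never exceeds it."""
    z = soft_threshold(x - gamma * grad(x), gamma * lam)
    return f(x) + grad(x) @ (z - x) + g(z) + np.sum((z - x) ** 2) / (2 * gamma)

# Plain proximal gradient iteration: forward (gradient) step, backward (prox) step.
x = np.zeros(10)
vals = []
for _ in range(200):
    vals.append(f(x) + g(x))
    x = soft_threshold(x - gamma * grad(x), gamma * lam)
```

A Newton-type method of the kind discussed in the talk would use `fbe` as a merit function to globalize fast local updates; here it is only evaluated, to show that it is computable at the cost of one extra prox-gradient step.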


Time: August 15, 2023, 9:00

Tencent Meeting ID: 173 542 225

Title: Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient

Abstract:

Gradient-based proximal algorithms have traditionally been bound to global Lipschitz differentiability requirements. Attempts to widen their applicability or reduce conservatism typically involve wasteful trial-and-error backtracking routines. Extending recent advancements in the smooth setting, we show how for convex problems it is possible to avoid backtracking altogether and retrieve stepsizes adaptively without function evaluations. We demonstrate this with an adaptive primal-dual three-term splitting method that includes proximal gradient as a special case. Finally, these findings are extended to the even wider class of (simple) bilevel programs.
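The idea of retrieving stepsizes adaptively without function evaluations can be sketched with one simple rule from this line of work: estimate local curvature from consecutive gradients and grow the stepsize when the estimates allow it. The sketch below combines such a rule with a prox step on a made-up LASSO instance; the talk's primal-dual three-term method uses a refined rule, and no knowledge of the global Lipschitz constant is used anywhere:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Hypothetical problem data (illustration only, not from the talk).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 15))
b = rng.standard_normal(30)
lam = 0.05

grad = lambda x: A.T @ (A @ x - b)
obj = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

x_prev = np.zeros(15)
g_prev = grad(x_prev)
gamma_prev = 1e-3            # tiny initial stepsize; no Lipschitz constant needed
theta = 1.0                  # ratio of consecutive stepsizes
x = soft_threshold(x_prev - gamma_prev * g_prev, gamma_prev * lam)

objs = [obj(np.zeros(15))]
for _ in range(300):
    gk = grad(x)
    diff_x = np.linalg.norm(x - x_prev)
    diff_g = np.linalg.norm(gk - g_prev)
    # Local inverse-curvature estimate from the last two iterates;
    # guard against a vanishing gradient difference.
    local = diff_x / (2.0 * diff_g) if diff_g > 0 else np.inf
    # Adaptive rule: grow geometrically but never beyond the local estimate.
    gamma = min(np.sqrt(1.0 + theta) * gamma_prev, local)
    theta = gamma / gamma_prev
    x_prev, g_prev, gamma_prev = x, gk, gamma
    x = soft_threshold(x - gamma * gk, gamma * lam)
    objs.append(obj(x))
```

Note that each iteration costs one gradient and one prox evaluation, and no objective values are ever computed inside the stepsize rule, which is exactly the backtracking-free property the abstract emphasizes.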



Copyright: best365体育官网登录入口 - 365wm完美体育官网. Address: Huixian Building, No. 37 Daxuecheng Middle Road, Shapingba District, Chongqing. Website: www.xinsenhj.com. Postal code: 401331.
