Title: Fast Convergence Rates of Splitting Algorithms and Their Applications in Data Science
Speaker: Jin Zhang, Assistant Professor, Southern University of Science and Technology
Chair: Guihua Lin, Professor, 十大正规网投官网平台(中国)有限公司
Time: 2:30–3:30 p.m., Tuesday, July 2, 2019
Venue: Room 420, 十大正规网投官网平台, East Area, Main Campus
Organizers: 十大正规网投官网平台(中国)有限公司; 十大正规网投官网平台(中国)有限公司 Young Faculty Association
About the speaker:
Jin Zhang is an assistant professor in the Department of Mathematics at the Southern University of Science and Technology. He received a bachelor's degree from the School of Humanities and Social Sciences at Dalian University of Technology in 2007, a master's degree in science from the School of Mathematical Sciences at Dalian University of Technology in 2010, and a Ph.D. in applied mathematics from the Department of Mathematics and Statistics at the University of Victoria, Canada, in December 2014. From April 2015 to January 2019 he worked at Hong Kong Baptist University. His research focuses on optimization and its applications, and he has published more than 20 papers in journals including Mathematical Programming, SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, and the European Journal of Operational Research.
Abstract:
Despite the rich literature, the linear convergence of the alternating direction method of multipliers (ADMM) is not yet fully understood, even in the convex case. For example, linear convergence of ADMM is empirically observed in a wide range of applications, while existing theoretical results are either too stringent to be satisfied or too ambiguous to be verified, so it remains unclear why ADMM converges linearly in these applications. In this work, we systematically study the linear convergence of ADMM in the context of convex optimization through the lens of variational analysis. We show that the linear convergence of ADMM can be guaranteed without the strong convexity of the objective functions together with the full-rank assumption on the coefficient matrices, or the full polyhedricity assumption on their subdifferentials, and that linear convergence can be discerned for various concrete applications, especially for some representative models arising in statistical learning. Our analysis makes sophisticated use of variational-analysis techniques and is conducted for the most general proximal version of ADMM with Fortin and Glowinski's larger step size, so that all major variants of ADMM known in the literature are covered.
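To make the setting concrete, below is a minimal sketch (not the speaker's code) of ADMM on one representative statistical-learning model, the lasso problem min 0.5‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0. The function name `admm_lasso` and all parameter values are illustrative assumptions; the dual update uses Fortin and Glowinski's larger step size α ∈ (0, (1+√5)/2) mentioned in the abstract.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, alpha=1.5, iters=500):
    """ADMM for min 0.5||Ax - b||^2 + lam * ||z||_1  s.t.  x - z = 0.

    alpha is the Fortin-Glowinski step size applied in the dual update;
    convergence holds for alpha in (0, (1 + sqrt(5)) / 2).
    Returns the z iterate and the history of primal residuals ||x - z||.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    Atb = A.T @ b
    # Cache the inverse used by every x-update (fine for small n).
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))
    residuals = []
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))          # x-update: ridge-type solve
        z = soft_threshold(x + u, lam / rho)   # z-update: prox of the l1 term
        u = u + alpha * (x - z)                # dual update with larger step
        residuals.append(np.linalg.norm(x - z))
    return z, residuals
```

On small random instances, plotting `np.log(residuals)` against the iteration count typically yields a nearly straight line, i.e., the empirically observed linear convergence that the abstract sets out to explain.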
All faculty and students are welcome!