
Notice: Academic Seminar by Associate Professor Zhang Jin (Southern University of Science and Technology)

Published: 2023-03-08    Source: School of Mathematics, South China University of Technology

Title: Towards Gradient-based Bilevel Optimization in Machine Learning

Speaker: Associate Professor Zhang Jin

Time: 15:40-16:25, Sunday, March 12, 2023

Venue: Conference Room 4318, Building 4

Host: Pan Shaohua

All faculty and students are welcome to attend!

School of Mathematics

March 8, 2023


Abstract: Recently, Bi-Level Optimization (BLO) techniques have received extensive attention from the machine learning community. In this talk, we will discuss some recent advances in the applications of BLO. First, we study a gradient-based bilevel optimization method for learning tasks with a convex lower level. In particular, by formulating bilevel models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bilevel programming. Second, we focus on the variety of BLO models arising in complex practical tasks whose follower (lower-level) structure is non-convex in nature. In particular, we propose a new algorithmic framework, named Initialization Auxiliary and Pessimistic Trajectory Truncated Gradient Method (IAPTT-GM), to partially address the lower-level non-convexity. By introducing an auxiliary variable as initialization to guide the optimization dynamics and designing a pessimistic trajectory truncation operation, we construct a reliable approximation to the original BLO in the absence of the lower-level convexity hypothesis. Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA and IAPTT-GM on different tasks, including hyper-parameter optimization and meta learning.
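For context, the optimistic bilevel program referred to in the abstract is commonly written as follows. This is a standard textbook formulation, not a formula taken from the speaker's papers; here $F$ and $f$ denote the upper-level (leader) and lower-level (follower) objectives, and $S(x)$ is the lower-level solution set:

```latex
\min_{x \in X,\; y \in \mathbb{R}^m} \; F(x, y)
\quad \text{s.t.} \quad
y \in S(x) := \mathop{\arg\min}_{y' \in \mathbb{R}^m} f(x, y').
```

When $S(x)$ is not a singleton, the optimistic model lets the leader pick the follower solution most favorable to $F$. Gradient-based schemes of the kind discussed in the talk typically approximate $S(x)$ by a finite number of inner iterations and differentiate through that trajectory.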


About the Speaker: Zhang Jin is an Associate Professor in the Department of Mathematics at Southern University of Science and Technology and an Associate Director of the Shenzhen National Center for Applied Mathematics. He is a recipient of the national Excellent Young Scientists Fund, the Guangdong Distinguished Young Scholar award, and the Shenzhen Excellent Young Scholar award. He received his bachelor's degree in 2007 and his master's degree in 2010 from Dalian University of Technology, and his Ph.D. in 2014 from the University of Victoria, Canada. He worked at Hong Kong Baptist University from 2015 to 2018 and joined Southern University of Science and Technology in early 2019. His research focuses on optimization theory and its applications, with representative results published in influential optimization, computational mathematics, and machine learning journals and conferences, including Math Program, SIAM J Optim, Math Oper Res, SIAM J Numer Anal, J Mach Learn Res, and IEEE Trans Pattern Anal Mach Intell, as well as ICML and NeurIPS. His work received the 2020 Youth Science and Technology Award of the Operations Research Society of China and the 2022 Guangdong Youth Science and Technology Innovation Award. He has served as principal investigator on general-program projects funded by the National Natural Science Foundation of China, the Guangdong Natural Science Foundation, the Shenzhen Science and Technology Innovation Commission, and the Hong Kong Research Grants Council.