2017-12-05 | Jiayi Guo: Smooth quasi-Newton methods for nonsmooth optimization

Abstract

Sporadic informal observations over several decades (and most recently in Lewis-Overton, 2013) suggest that quasi-Newton methods for smooth optimization can also work surprisingly well on nonsmooth functions. This talk explores this phenomenon from several perspectives. First, we compare experimentally the two most popular quasi-Newton updates, BFGS and SR1, in the nonsmooth setting. Second, we study how repeated BFGS updating at a single fixed point can serve as a separation oracle for the subdifferential. Finally, we show how Powell's original 1976 BFGS convergence proof for smooth convex functions in fact extends to some nonsmooth settings.
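As background for readers unfamiliar with the update the abstract refers to, here is a minimal sketch (not the speaker's code) of the standard BFGS update of the inverse-Hessian approximation, checked on a small smooth convex quadratic; the function and matrix below are illustrative choices, not from the talk.

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k (the step), y = g_{k+1} - g_k (the gradient change).
    Standard formula: H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,
    with rho = 1 / (y^T s); it requires the curvature condition y^T s > 0.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Tiny demo on the smooth convex quadratic f(x) = 0.5 * x^T A x
# (A is an arbitrary positive-definite example matrix):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x

x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.2])
s, y = x1 - x0, grad(x1) - grad(x0)
H = bfgs_update(np.eye(2), s, y)

# By construction the update enforces the secant condition H @ y == s
# and preserves symmetry of H.
print(np.allclose(H @ y, s))  # True
```

The experimental question raised in the talk is how this update, designed with smooth curvature in mind, behaves when the gradients fed into `y` come from a nonsmooth function.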


Time

Tuesday, December 5, 2017, 14:00


Speaker

Jiayi Guo is a Ph.D. student in the School of Operations Research and Information Engineering at Cornell University, under the supervision of Professor Adrian Lewis in the same department. He expects to graduate in May 2018. Broadly conceived, his research area is optimization. Currently, his work explores variations of iterative methods for solving continuous optimization problems with nonsmooth objective functions. More generally, Jiayi is interested in the interplay between optimization, simulation, and numerical analysis.


Before coming to Cornell, he received dual B.S. degrees in Mathematics and in Computer Science (2012) from the University of Illinois at Urbana-Champaign.


Location

Conference Room 308, School of Information Management and Engineering

Shanghai University of Finance and Economics

100 Wudong Road, Yangpu District, Shanghai