2017-06-30 | High Dimensional Statistical Optimization


Abstract

Statistical optimization has received considerable interest recently. It refers to settings where hidden, local convexity can be discovered with high probability in nonconvex problems, making polynomial-time algorithms possible. It relies on a careful analysis of the geometry near global optima. In this talk, I will explore this direction by focusing on sparse regression problems in high dimensions. A computational framework named iterative local adaptive majorize-minimization (I-LAMM) is proposed to simultaneously control algorithmic complexity and statistical error. I-LAMM effectively turns the nonconvex penalized regression problem into a sequence of convex programs by exploiting the local strong convexity of the problem when the solution set is restricted to an ℓ1 cone. Computationally, we establish a phase-transition phenomenon: the algorithm enjoys a linear rate of convergence after a sub-linear burn-in period. Statistically, it delivers solutions with optimal statistical errors. Extensions to various models, such as robust regression models and matrix models, will be discussed.
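To make the majorize-minimization idea concrete, here is a minimal illustrative sketch in Python of one such scheme for ℓ1-penalized least squares: at each iteration the smooth loss is majorized by an isotropic quadratic, whose minimizer is a soft-thresholding step, and the quadratic parameter is adapted by backtracking until the majorization holds. This is a generic local adaptive MM sketch under a lasso penalty, not the paper's exact I-LAMM procedure (which handles folded-concave penalties and a two-stage contraction/tightening scheme); the function and parameter names are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lamm_lasso(X, y, lam, phi0=1.0, gamma=2.0, max_iter=500, tol=1e-8):
    """Adaptive majorize-minimization sketch for the lasso (illustrative only)."""
    n, p = X.shape
    beta = np.zeros(p)
    f = lambda b: 0.5 * np.mean((y - X @ b) ** 2)      # smooth loss
    grad = lambda b: -X.T @ (y - X @ b) / n            # its gradient
    for _ in range(max_iter):
        g = grad(beta)
        phi = phi0
        while True:
            # minimize the quadratic majorizer + lam * ||.||_1 in closed form
            beta_new = soft_threshold(beta - g / phi, lam / phi)
            diff = beta_new - beta
            # accept once the quadratic majorizer dominates the loss
            if f(beta_new) <= f(beta) + g @ diff + 0.5 * phi * (diff @ diff):
                break
            phi *= gamma  # otherwise inflate the majorization parameter
        beta = beta_new
        if np.max(np.abs(diff)) < tol:
            break
    return beta
```

Each accepted step decreases the penalized objective, and on a well-conditioned sparse problem the iterates settle into the fast (linear-rate) regime once they enter the locally strongly convex neighborhood described in the abstract.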

Time

June 30, 2017 (Friday), 10:00–11:45

Speaker

Qiang Sun

University of Toronto


Qiang is currently an assistant professor in the Department of Statistical Sciences at the University of Toronto and holds a visiting appointment in the Department of Operations Research and Financial Engineering at Princeton University. He earned his doctoral degree in Biostatistics from the University of North Carolina at Chapel Hill. His research interests span a broad spectrum, including hypothesis-driven imaging genetics, statistical optimization for big data, nonasymptotic inference, and robustness in high dimensions. He has published in both statistical and scientific journals such as JASA, AoS, JRSSB, and EST.

Room

Room 308, School of Information Management and Engineering

Shanghai University of Finance and Economics

100 Wudong Road, Yangpu District, Shanghai