2017-07-25 | Compete in Price or Service?—A Study of Personalized Pricing and Money Back Guarantees
Statistical optimization has received considerable interest recently. It refers to the case where hidden and local convexity can be discovered with high probability for nonconvex problems, making polynomial-time algorithms possible. It relies on careful analysis of the geometry near global optima. In this talk, I will explore this direction by focusing on sparse regression problems in high dimensions. A computational framework named iterative local adaptive majorize-minimization (I-LAMM) is proposed to simultaneously control algorithmic complexity and statistical error. I-LAMM effectively turns the nonconvex penalized regression problem into a series of convex programs by exploiting the local strong convexity of the problem when the solution set is restricted to an l1 cone. Computationally, we establish a phase transition phenomenon: the algorithm enjoys a linear rate of convergence after a sub-linear burn-in. Statistically, it provides solutions with optimal statistical errors. Extensions to various models, such as robust regression models and matrix models, will be discussed.
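The abstract's key mechanism — turning a nonconvex penalized regression into a sequence of convex programs via majorize-minimization — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a SCAD penalty majorized by a weighted l1 norm (local linear approximation) and solves each resulting weighted lasso subproblem by proximal gradient descent; all function names and parameter choices here (e.g. `lam`, `a=3.7`) are illustrative.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    """Derivative of the SCAD penalty (used as adaptive l1 weights)."""
    t = np.abs(t)
    return np.where(t <= lam, lam,
                    np.where(t < a * lam,
                             np.maximum(a * lam - t, 0.0) / (a - 1), 0.0))

def soft_threshold(z, tau):
    """Proximal operator of the weighted l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def weighted_lasso_ista(X, y, w, beta0, step, n_iters=200):
    """Solve min_b (1/2n)||y - Xb||^2 + sum_j w_j|b_j| by proximal gradient."""
    n = X.shape[0]
    beta = beta0.copy()
    for _ in range(n_iters):
        grad = X.T @ (X @ beta - y) / n
        beta = soft_threshold(beta - step * grad, step * w)
    return beta

def mm_sparse_regression(X, y, lam, n_outer=5):
    """Outer MM loop: re-majorize the folded-concave penalty at the
    current iterate, then solve the resulting convex (weighted lasso)
    subproblem -- a series of convex programs, as in the abstract."""
    n, p = X.shape
    step = 1.0 / np.linalg.eigvalsh(X.T @ X / n).max()  # 1/L for the quadratic majorizer
    beta = np.zeros(p)
    for _ in range(n_outer):
        w = scad_deriv(beta, lam)  # local linear approximation of SCAD at beta
        beta = weighted_lasso_ista(X, y, w, beta, step)
    return beta

# Tiny synthetic example: 5 strong signals among 50 coefficients.
rng = np.random.default_rng(0)
n, p, s = 200, 50, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 2.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)
beta_hat = mm_sparse_regression(X, y, lam=0.3)
```

Note how the nonconvexity lives only in the outer loop (re-weighting), while every subproblem is convex; the SCAD-derivative weights vanish on large coefficients, which is what removes the lasso's shrinkage bias.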
University of Delaware
Professor CHEN Bintong received the prestigious "Chang Jiang Scholar" award from the Ministry of Education in 2008 and the "Thousand Expert Plan" national award in 2010.