Colloquium on Artificial Intelligence Research and Optimization
Inexact Proximal Stochastic Gradient Method for Empirical Risk Minimization
Hongchao Zhang, Louisiana State University
Professor
Virtual (Zoom) - REGISTRATION REQUIRED (SEE ABSTRACT)
February 03, 2021 - 01:00 pm
|
Abstract: We will discuss algorithmic frameworks of an inexact proximal stochastic gradient method for solving empirical composite optimization problems, whose objective function is the sum of an average of a large number of smooth convex or nonconvex functions and a convex, but possibly nonsmooth, function. At each iteration, the algorithm inexactly solves a proximal subproblem constructed using a stochastic gradient of the objective function. Variance reduction techniques are incorporated to reduce the variance of the stochastic gradient. The main feature of these algorithms is that the proximal subproblems may be solved inexactly while global convergence with desirable complexity bounds is still guaranteed. Global convergence and component gradient complexity bounds are derived for the cases where the objective function is strongly convex, convex, or nonconvex. Preliminary numerical experiments indicate the efficiency of the algorithm.
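As a rough illustration of the framework described in the abstract (not the speaker's actual algorithm), below is a minimal Python sketch of a prox-SVRG-style loop in which the proximal subproblem is solved only approximately. The helper names grad_i and prox_inexact, as well as the tolerance schedule tol, are illustrative assumptions.

import numpy as np

def inexact_prox_svrg(grad_i, prox_inexact, x0, n, step, epochs, m):
    # Sketch of an inexact proximal stochastic gradient method with
    # SVRG-style variance reduction for F(x) = (1/n) * sum_i f_i(x) + h(x).
    # grad_i(x, i): gradient of the i-th smooth component f_i at x.
    # prox_inexact(v, step, tol): approximately minimizes
    #     h(u) + (1/(2*step)) * ||u - v||^2
    # to accuracy tol -- the inexactly solved proximal subproblem.
    x = x0.copy()
    for s in range(epochs):
        x_ref = x.copy()
        # Full gradient at the reference point (variance-reduction anchor).
        full_grad = np.mean([grad_i(x_ref, i) for i in range(n)], axis=0)
        for t in range(m):
            i = np.random.randint(n)
            # Variance-reduced stochastic gradient estimate.
            g = grad_i(x, i) - grad_i(x_ref, i) + full_grad
            # Inexact proximal step: the tolerance tightens over iterations;
            # this particular schedule is assumed for illustration only.
            tol = 1.0 / (s * m + t + 1) ** 2
            x = prox_inexact(x - step * g, step, tol)
    return x

For instance, with h being the l1-norm, prox_inexact could run a few iterations of an iterative subproblem solver stopped once its accuracy falls below tol; the point of such frameworks is that approximate solves of this kind suffice for convergence.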
REGISTRATION IS FREE AND REQUIRED TO RECEIVE ZOOM ID:
|
Speaker's Bio:
Hongchao Zhang received his PhD in applied mathematics from the University of Florida in 2006. He then held postdoctoral positions at the Institute for Mathematics and Its Applications (IMA) and the IBM T.J. Watson Research Center. He joined LSU as an assistant professor in 2008 and is now a professor in the Department of Mathematics and the Center for Computation & Technology (CCT) at LSU. His research interests are nonlinear optimization theory, algorithms, and applications.
|