Accelerating Descent Methods: A Dynamical Systems Perspective

4/25/19 | 4:15pm | E51-325
Reception to follow.

Ashia Wilson

Postdoctoral Researcher
Microsoft Research 


Abstract: The connection between continuous-time dynamics and discrete-time algorithms has led to the introduction of several methods in optimization. We add to this body of work by introducing a family of descent dynamics and descent algorithms with matching non-asymptotic convergence guarantees. This framework recovers many standard results in optimization. In addition, we describe several general frameworks for accelerating descent algorithms when the objective function is convex. We use these frameworks to analyze a simple first-order algorithm called rescaled gradient descent (RGD), and show that RGD achieves much faster convergence guarantees than gradient descent when the function is sufficiently smooth. Throughout, we provide several examples and numerical demonstrations.
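The announcement does not give the RGD update rule, so the following is only a minimal sketch, assuming a rescaled step of the form x_{k+1} = x_k - eta * grad f(x_k) / ||grad f(x_k)||^((p-2)/(p-1)); the test objective, step size eta, and exponent p are illustrative assumptions, not details from the talk.

import numpy as np

def gradient_descent(grad, x0, eta=0.1, steps=200):
    # Plain gradient descent: x <- x - eta * grad(x).
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - eta * grad(x)
    return x

def rescaled_gradient_descent(grad, x0, eta=0.1, p=4, steps=200, tol=1e-12):
    # Hypothetical sketch of rescaled gradient descent (RGD):
    #   x <- x - eta * grad(x) / ||grad(x)||^((p-2)/(p-1)).
    # The update rule is an assumed form, not quoted from the abstract.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        norm = np.linalg.norm(g)
        if norm < tol:  # stop once the gradient is numerically zero
            break
        x = x - eta * g / norm ** ((p - 2) / (p - 1))
    return x

# Illustrative comparison on a smooth quartic f(x) = sum(x**4), grad f(x) = 4*x**3.
grad = lambda x: 4 * x ** 3
print(gradient_descent(grad, [1.0, -2.0]))
print(rescaled_gradient_descent(grad, [1.0, -2.0]))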

Bio: Ashia Wilson is a postdoctoral researcher at Microsoft Research, New England. She received her doctorate in Statistics from UC Berkeley, with undergraduate degrees from Harvard. Her work focuses on designing and analyzing algorithms for inference using dynamical systems and various tools from physics. At Berkeley, she was advised by Michael Jordan and Benjamin Recht. She has received several fellowships, including the Berkeley Chancellor's Fellowship and the NSF Graduate Research Fellowship.

Event Time: April 25, 2019 - 16:15