How to Escape Saddle Points Efficiently?

Speaker: Dr. Rong Ge
Organization: Duke University
Location: EBIII, Room 2213
Start Date: October 13, 2017, 11:45 AM
End Date: October 13, 2017, 1:00 PM

Abstract

The presence of saddle points is a key feature of non-convex objective functions. In this talk, we will discuss why saddle points are ubiquitous in machine learning applications and how to deal with them efficiently. We will show that for many natural problems, all local minima are also global and the saddle points are strict. Given such nice geometric properties, we show how a simple modification of gradient descent can optimize these problems very efficiently. This talk is based on joint work with Chi Jin, Praneeth Netrapalli, Sham M. Kakade, and Michael I. Jordan.

Bio

Rong Ge is an assistant professor in the Computer Science Department at Duke University. He received his Ph.D. from Princeton University and worked as a post-doctoral researcher at Microsoft Research New England. He is broadly interested in theoretical computer science and machine learning. His research focuses on designing algorithms with provable guarantees for machine learning problems, using techniques including tensor decompositions and non-convex optimization.
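The abstract does not spell out the "simple modification of gradient descent," but the idea in this line of work is that adding a small random perturbation when the gradient is nearly zero lets iterates escape strict saddle points. The following is a minimal illustrative sketch, not the speakers' actual algorithm; the function names, step sizes, and stopping parameters are all assumptions chosen for demonstration.

```python
import numpy as np

def perturbed_gd(grad, x0, lr=0.1, noise_radius=1e-2, tol=1e-3, steps=500, seed=0):
    """Gradient descent that adds a small random perturbation whenever the
    gradient is nearly zero, so iterates can leave strict saddle points.
    (Illustrative sketch only; parameters are not from the talk.)"""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            # Near a first-order stationary point: jump a small random
            # distance; at a strict saddle, some directions curve downward.
            d = rng.normal(size=x.shape)
            x = x + noise_radius * d / np.linalg.norm(d)
        else:
            x = x - lr * g
    return x

# f(x, y) = x^2 - y^2 has a strict saddle at the origin: plain gradient
# descent started there never moves, but the perturbed variant escapes
# along the negative-curvature (y) direction.
grad_f = lambda v: np.array([2.0 * v[0], -2.0 * v[1]])
x = perturbed_gd(grad_f, [0.0, 0.0])
```

Started exactly at the saddle, the gradient is zero, so the first iteration perturbs; subsequent gradient steps then amplify the component along the negative-curvature direction while the stable x-component shrinks back toward zero.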
