December 25, 2019

146 words 1 min read

How to escape saddle points efficiently

Talk Title How to escape saddle points efficiently
Speakers Michael Jordan (UC Berkeley)
Conference Artificial Intelligence Conference
Conf Tag Put AI to Work
Location San Francisco, California
Date September 18-20, 2017
URL Talk Page
Video Talk Video

Many new theoretical challenges have arisen in the area of gradient-based optimization for large-scale data analysis, driven by the needs of applications and the opportunities provided by new hardware and software platforms. Drawing on work undertaken with Chi Jin, Rong Ge, Praneeth Netrapalli, and Sham Kakade, Michael Jordan shares recent research on the avoidance of saddle points in high-dimensional nonconvex optimization.
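
The work referenced here corresponds to the paper "How to Escape Saddle Points Efficiently" (Jin, Ge, Netrapalli, Kakade, and Jordan, ICML 2017), which analyzes perturbed gradient descent: when the gradient becomes small, a small random perturbation is added to the iterate so that the dynamics move away from strict saddle points. The sketch below is only an illustration of that idea under stated assumptions; the function name, parameters (noise_radius, grad_threshold, perturb_interval), and their default values are illustrative choices, not the constants used in the paper's analysis.

```python
import numpy as np


def perturbed_gradient_descent(grad, x0, step=0.01, noise_radius=0.1,
                               grad_threshold=1e-3, perturb_interval=50,
                               max_iters=1000, rng=None):
    """Gradient descent with occasional random perturbations (illustrative sketch).

    When the gradient norm drops below `grad_threshold` (a possible saddle
    point), the iterate is jittered by a random vector of length at most
    `noise_radius`, and no further perturbation is applied for
    `perturb_interval` iterations. Names and defaults are assumptions for
    illustration, not the paper's tuned constants.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    last_perturb = -perturb_interval
    for t in range(max_iters):
        if (np.linalg.norm(grad(x)) < grad_threshold
                and t - last_perturb >= perturb_interval):
            # Near-zero gradient: possibly a saddle point, so jitter the iterate.
            direction = rng.normal(size=x.shape)
            direction /= np.linalg.norm(direction)
            x = x + noise_radius * rng.uniform() * direction
            last_perturb = t
        # Standard gradient step.
        x = x - step * grad(x)
    return x


# Toy usage: f(x, y) = x^2 - y^2 has a strict saddle at the origin.
# Starting on the x-axis, plain gradient descent converges to the saddle;
# the perturbation pushes the iterate off the axis so it escapes.
saddle_grad = lambda z: np.array([2.0 * z[0], -2.0 * z[1]])
x_escaped = perturbed_gradient_descent(saddle_grad, x0=np.array([1.0, 0.0]))
print(x_escaped)
```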