Event Details

  • Monday, November 6, 2017
  • 14:30 - 15:00

From Statics to Dynamics and from Convexity to Nonconvexity: the Scaling Limit of Iterative Algorithms for High-Dimensional Inference

This talk presents our recent work on analyzing, in the high-dimensional limit, the exact dynamics of iterative algorithms for solving nonconvex optimization problems that arise in signal estimation. For concreteness, the talk will focus on the prototypical problem of sparse principal component analysis (PCA).

We show that the time-varying empirical measures of the estimates produced by the algorithms converge weakly to a deterministic “limiting process” in the high-dimensional limit. Moreover, this limiting process can be characterized as the unique solution of a nonlinear PDE, and it provides exact information about the asymptotic performance of the algorithms. For example, performance metrics such as the mean squared error (MSE), the cosine similarity, and the misclassification rate in sparse support recovery can all be read off from the deterministic limiting process. A steady-state analysis of the nonlinear PDE also reveals interesting phase-transition phenomena related to the performance of the algorithms.

Although our analysis is asymptotic in nature, numerical simulations show that the theoretical predictions are accurate for moderate signal dimensions. We will conclude the talk by briefly presenting the application of similar analysis techniques to other nonconvex optimization problems, including phase retrieval, subspace tracking, low-rank matrix estimation, and dictionary learning.

Joint work with Chuang Wang (Harvard University) and Jonathan Mattingly (Duke University).
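To make the setting concrete, the sketch below simulates one plausible instance of the kind of iterative algorithm the abstract describes. The abstract does not specify the algorithm, so this is an illustrative stand-in: a soft-thresholded power iteration for sparse PCA on a spiked covariance model, tracking the cosine similarity between the running estimate and the planted sparse component. All parameter names and values (`n`, `m`, `k`, `snr`, `thresh`) are hypothetical choices for the demo, not from the talk.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator; shrinks small entries to zero to promote sparsity."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_pca_cosines(n=500, m=2000, k=50, snr=2.0, n_iter=40, thresh=0.01, seed=0):
    """Run a thresholded power iteration on a spiked model (illustrative stand-in).

    Returns the trajectory of |<xi, x_t>|, the cosine similarity between
    the planted sparse direction xi and the estimate x_t at each iteration.
    """
    rng = np.random.default_rng(seed)

    # Planted k-sparse unit vector xi (the "signal" direction).
    xi = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    xi[support] = rng.standard_normal(k)
    xi /= np.linalg.norm(xi)

    # Spiked data model: each row is sqrt(snr) * u_i * xi + Gaussian noise.
    u = rng.standard_normal((m, 1))
    X = np.sqrt(snr) * u * xi + rng.standard_normal((m, n))
    S = X.T @ X / m  # sample covariance matrix

    # Random unit-norm initialization, nearly orthogonal to xi in high dimension.
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)

    cosines = []
    for _ in range(n_iter):
        x = soft_threshold(S @ x, thresh)  # power step + sparsity-promoting shrinkage
        x /= np.linalg.norm(x)             # project back to the unit sphere
        cosines.append(abs(float(xi @ x)))
    return cosines
```

Plotting `sparse_pca_cosines()` for several values of `n` (with `m/n` fixed) would illustrate the concentration phenomenon the talk analyzes: the cosine-similarity trajectories become nearly deterministic as the dimension grows.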