Research

My research interests are in nonlinear optimization and machine learning. I work on exploiting two fundamental structures in optimization problems: least-squares structure and symmetry. A specific focus has been on developing efficient numerical algorithms for large-scale problems with rank and orthogonality constraints, which are ubiquitous in machine learning applications.

Below are some specific projects that I have been actively working on with my collaborators.

Research at Amazon.com

  • Competitive pricing of products. We estimate competitive prices for third-party (3P) unique products, i.e., products for which Amazon is not a retail player. The large-scale nature of the project demands simple algorithmic implementations that are readily scalable. Currently, we are generating price estimates for all of the Amazon marketplaces. (Patent filed.)
  • Demand forecasting. We study low-rank tensor factorization algorithms for demand forecasting problems. This is especially critical for capturing “global” trends in the data. A minimal sketch of such a factorization follows this list.
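
As a hedged illustration of the kind of model involved, here is a minimal rank-R CP (CANDECOMP/PARAFAC) factorization fitted by alternating least squares on a toy demand cube. The shapes, rank, and data below are illustrative assumptions, not the production setup.

```python
import numpy as np

def unfold(T, mode):
    """Matricize a 3-way tensor along `mode` (C-order columns)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R)."""
    R = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, R)

def cp_als(T, rank, n_iters=50, seed=0):
    """Fit T ~ sum_r a_r (outer) b_r (outer) c_r by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iters):
        # Each factor update solves a plain linear least-squares problem.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Toy "items x regions x weeks" cube with a planted rank-3 signal plus noise.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.random((30, 3)), rng.random((8, 3)), rng.random((26, 3))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0) + 0.01 * rng.standard_normal((30, 8, 26))
A, B, C = cp_als(T, rank=3)
```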

Research themes

  • Metric tuning on Riemannian manifolds. Understanding the (Riemannian) geometry of structured constraints is of particular interest in machine learning. Conceptually, it allows one to translate a constrained optimization problem into an unconstrained optimization problem on a nonlinear search space (a manifold). Building on this point of view, one research interest is to understand the role of (Riemannian) metrics, or inner products, in developing better-conditioned optimization algorithms. This idea broadly leads to the notion of Riemannian preconditioning.
    We have been actively involved in both matrix and tensor applications. Specific papers include [arXiv:1211.1550][arXiv:1306.2672][arXiv:1405.6055][arXiv:1605.08257]. The first sketch after this list illustrates the idea on matrix completion.
  • Decentralized and stochastic optimization algorithms. We explore recent advances in stochastic gradient algorithms on manifolds. Specific papers include [arXiv:1605.07367][arXiv:1603.04989]. The second sketch after this list gives a minimal example on the sphere.
    We also exploit consensus learning on manifolds in the context of large-scale distributed algorithms for problems such as matrix factorization and multitask learning. Initial works are in [arXiv:1605.06968][arXiv:1705.00467].
  • Low-rank optimization with structure. We develop efficient algorithms for problems in big data systems by exploiting low-rank plus sparse decompositions. Papers include [arXiv:1604.04325][arXiv:1607.07252].
    The work [arXiv:1704.07352] proposes a generic framework for tackling low-rank optimization with constraints by exploiting a variational characterization of the nuclear norm. The third sketch after this list shows a minimal low-rank-plus-sparse alternation.
  • Deep learning. We study geometric algorithms for modern deep networks that are robust to invariances of the parameters. An initial work is in [arXiv:1511.01754]. Currently, we are exploring the duality between “complex architecture and simple algorithms” and “simple architecture and complex algorithms”. The last sketch after this list makes one such parameter invariance concrete.
  • Manopt. I am also involved in the development and promotion of Manopt, a Matlab toolbox for optimization on manifolds.
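
To make the metric-tuning idea concrete, the following is a minimal sketch, not the exact method of the papers above, of low-rank matrix completion on the factorization X = U V^T, where the Euclidean gradients are rescaled by (V^T V)^{-1} and (U^T U)^{-1}. That rescaling is one simple instance of choosing a non-Euclidean metric, i.e., Riemannian preconditioning; the step size, rank, and data are illustrative assumptions.

```python
import numpy as np

def scaled_gd_completion(Y, mask, rank, lr=0.9, n_iters=300, seed=0):
    """Matrix completion on X = U V^T with metric-scaled gradient steps.

    The Euclidean gradients are right-multiplied by (V^T V)^{-1} and
    (U^T U)^{-1}; equivalently, gradient descent under a tuned metric.
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    ridge = 1e-8 * np.eye(rank)  # small ridge for numerical safety
    for _ in range(n_iters):
        R = mask * (U @ V.T - Y)  # residual on observed entries only
        U = U - lr * (R @ V) @ np.linalg.inv(V.T @ V + ridge)
        V = V - lr * (R.T @ U) @ np.linalg.inv(U.T @ U + ridge)
    return U, V

# Toy problem: rank-5 ground truth with 30% of entries observed.
rng = np.random.default_rng(1)
Yt = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
mask = (rng.random((100, 80)) < 0.3).astype(float)
U, V = scaled_gd_completion(mask * Yt, mask, rank=5)
print(np.linalg.norm(U @ V.T - Yt) / np.linalg.norm(Yt))  # relative error
```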
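For the stochastic side, here is a hedged, minimal example of Riemannian stochastic gradient descent on the unit sphere, estimating a leading eigenvector from a stream of samples. The projection-plus-normalization retraction and the 1/t step size are simple illustrative choices, not the specific algorithms of the papers cited.

```python
import numpy as np

def rsgd_top_eigvec(sample_stream, dim, n_steps=2000, step0=1.0, seed=0):
    """Riemannian SGD on the sphere for max_x x^T E[z z^T] x, ||x|| = 1.

    Each step: (1) stochastic Euclidean gradient, (2) projection onto the
    tangent space at x, (3) retraction back to the sphere by normalization.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)
    x /= np.linalg.norm(x)
    for t, z in zip(range(1, n_steps + 1), sample_stream):
        g = -2.0 * (z @ x) * z        # stochastic Euclidean gradient
        rg = g - (x @ g) * x          # tangent-space projection
        x = x - (step0 / t) * rg      # gradient step ...
        x /= np.linalg.norm(x)        # ... followed by retraction
    return x

# Toy stream: samples with covariance diag(3, 1, ..., 1); the leading
# eigenvector is the first coordinate axis.
rng = np.random.default_rng(1)
scale = np.ones(10); scale[0] = np.sqrt(3.0)
stream = (scale * rng.standard_normal(10) for _ in range(2000))
x = rsgd_top_eigvec(stream, dim=10)
print(abs(x[0]))  # close to 1 when the estimate aligns with the top eigenvector
```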
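A minimal low-rank-plus-sparse alternation in the same spirit follows. This particular scheme, alternating a truncated SVD with entrywise soft-thresholding, is a generic illustration rather than the method of the cited papers; the rank and threshold are assumptions.

```python
import numpy as np

def soft_threshold(X, tau):
    """Entrywise soft-thresholding, the proximal map of tau * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def low_rank_plus_sparse(M, rank, tau, n_iters=50):
    """Alternate M ~ L + S: L from a rank-`rank` truncated SVD of M - S,
    S from soft-thresholding of M - L."""
    S = np.zeros_like(M)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        S = soft_threshold(M - L, tau)
    return L, S

# Toy: planted rank-2 matrix plus a few large sparse corruptions.
rng = np.random.default_rng(1)
L0 = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
S0 = np.where(rng.random((50, 40)) < 0.05, 10.0, 0.0)
L, S = low_rank_plus_sparse(L0 + S0, rank=2, tau=0.5)
```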
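Finally, the parameter invariances mentioned in the deep learning item can be made concrete: for ReLU layers, scaling one layer's weights by a > 0 and dividing the next layer's by a leaves the network function unchanged, so an algorithm that ignores this geometry treats functionally identical parameters as different points. A tiny check with made-up shapes:

```python
import numpy as np

def two_layer_relu(x, W1, W2):
    """f(x) = W2 relu(W1 x); biases omitted for brevity."""
    return W2 @ np.maximum(W1 @ x, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((3, 8))
a = 7.3  # any positive scale
# Positive homogeneity of relu: relu(a * u) = a * relu(u) for a > 0,
# so the rescaled parameters define exactly the same function.
assert np.allclose(two_layer_relu(x, W1, W2), two_layer_relu(x, a * W1, W2 / a))
```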