Radu-Alexandru Dragomir

- Researcher in numerical optimization and data science -

About me


Since 2024, I have been an assistant professor in the S2A team at Télécom Paris. I work on optimization methods for machine learning and data science.

Prior to that, I held two post-doctoral positions: first at UCLouvain with Yurii Nesterov, and then at EPFL in the OPTIM group of Nicolas Boumal.

From 2018 to 2021, I did my PhD under the joint supervision of Jérôme Bolte and Alexandre d'Aspremont, within the SIERRA team in Paris.

Email: dragomir [at] telecom-paris.fr


Research

I study optimization methods for solving large-scale problems arising in signal processing and data science. My research revolves around understanding problems with non-quadratic and non-convex geometries. Among other topics, I am interested in:

  • Nonlinear inverse problems
  • Mirror descent
  • Quartic polynomials
  • Riemannian optimization
  • Matrix factorization
  • Computer-aided performance estimation
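
To give a concrete flavor of some of these topics, here is a minimal, illustrative sketch of mirror descent (Bregman gradient descent), not taken from any of the papers below: the Euclidean geometry of plain gradient descent is replaced by a geometry induced by a reference function h, here the Burg entropy h(x) = -Σ_i log x_i, applied to the toy objective f(x) = Σ_i (x_i - log x_i) over positive variables. The objective, kernel, and step size are assumptions chosen purely for illustration.

    import numpy as np

    def bregman_gradient_descent(grad_f, x0, step=0.2, iters=200):
        # Mirror descent with the Burg entropy kernel h(x) = -sum(log x_i):
        #   grad h(x_{k+1}) = grad h(x_k) - step * grad f(x_k),
        # where grad h(x) = -1/x and (grad h)^{-1}(y) = -1/y.
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            dual = -1.0 / x - step * grad_f(x)  # gradient step in the dual ("mirror") space
            x = -1.0 / dual                     # map back to the primal space
        return x

    # Toy problem: f(x) = sum(x_i - log x_i), minimized at x = (1, ..., 1).
    # For this f and kernel, the dual iterate stays negative, so x stays positive.
    grad_f = lambda x: 1.0 - 1.0 / x
    print(bregman_gradient_descent(grad_f, x0=[3.0, 0.5]))  # approximately [1. 1.]

For this pair (f, h), f is 1-smooth relative to h (their Hessians coincide), so the fixed step size above is admissible in the relatively-smooth setting studied in the works below.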

Preprints

  • R-A. Dragomir, Y. Nesterov. Convex Quartic Problems: Homogenized Gradient Method and Preconditioning. [arxiv] [slides]

Publications

  • R-A. Dragomir, M. Even, H. Hendrikx. Fast Stochastic Bregman Gradient Methods: Sharp Analysis and Variance Reduction.
    International Conference on Machine Learning, 2021. [PMLR] [arxiv] [slides]
  • R-A. Dragomir, A. B. Taylor, A. d'Aspremont, J. Bolte. Optimal Complexity and Certification of Bregman First-Order Methods.
    Mathematical Programming, 2021. [springer] [arxiv] [GeoGebra demo] [code]
  • R-A. Dragomir, A. d'Aspremont, J. Bolte. Quartic First-Order Methods for Low-Rank Minimization.
    Journal of Optimization Theory and Applications, 2021. [springer] [arxiv] [code]

Thesis

  • R-A. Dragomir, Bregman Gradient Methods for Relatively-Smooth Optimization.
    PhD thesis, 2021. Advised by Jérôme Bolte and Alexandre d'Aspremont. [pdf] [slides] [video]