Publications

The list of publications is also available on Google Scholar and Semantic Scholar.

PhD thesis

BM, “A Riemannian approach to large-scale constrained least-squares with symmetries”, PhD thesis, University of Liège, 2014 [Orbi:173257] [Thesis presentation]. The PhD jury included Lieven De Lathauwer, Quentin Louveaux, Louis Wehenkel, Philippe Toint, Francis Bach, and Rodolphe Sepulchre.

Patents

A patent filed on computing competitive reference prices of products.

A patent filed on a two-dimensional distributed computing approach for matrix completion through gossip.

Preprints

M. Bhutani and BM, “A two-dimensional decomposition approach for matrix completion through gossip”, arXiv preprint arXiv:1711.07684, 2017 [arXiv:1711.07684]. A shorter version was accepted to the NIPS workshop on Emergent Communication, 2017.

H. Kasai and BM, “Riemannian joint dimensionality reduction and dictionary learning on symmetric positive definite manifold”, submitted 2017.

S. Mahadevan, S. Ghosh, and BM, “Domain adaptation using geometric mean metric learning on manifolds”, submitted 2017.

M. Nimishakavi, P. Jawanpuria, and BM, “A dual framework for low-rank tensor completion”, submitted 2017 [arXiv:1712.01193] [Matlab codes page]. A shorter version was accepted to the NIPS workshop on Synergies in Geometric Data Analysis, 2017.

P. Jawanpuria and BM, “A saddle point approach to structured low-rank matrix learning”, arXiv preprint arXiv:1704.07352, 2017 [arXiv:1704.07352] [Matlab codes page]. A shorter version was accepted to the 10th NIPS workshop on optimization for machine learning, 2017.

H. Kasai, H. Sato, and BM, “Riemannian stochastic quasi-Newton algorithm with variance reduction and its convergence analysis”, arXiv preprint arXiv:1703.04890, 2017 [arXiv:1703.04890].

BM, H. Kasai, P. Jawanpuria, and A. Saroop, “A Riemannian gossip approach to decentralized subspace learning on Grassmann manifold”, Technical report, 2017 [arXiv:1705.00467] [Matlab codes]. It is an extension of the technical report [arXiv:1605.06968]. A shorter version was earlier accepted to the 9th NIPS workshop on optimization for machine learning (OPT2016), held in Barcelona.

H. Sato, H. Kasai, and BM, “Riemannian stochastic variance reduced gradient”, arXiv preprint arXiv:1702.05594, 2017 [arXiv:1702.05594]. A shorter version is “Riemannian stochastic variance reduced gradient on Grassmann manifold”, Technical report, 2016 [arXiv:1605.07367] [Matlab codes]. A still shorter version was accepted to the 9th NIPS workshop on optimization for machine learning (OPT2016), held in Barcelona.

V. Badrinarayanan, BM, and R. Cipolla, “Symmetry-invariant optimization in deep networks”, Technical report, arXiv:1511.01754, 2015 [arXiv:1511.01754] [Matlab code webpage]. A shorter version, “Understanding symmetries in deep networks”, appears in the 8th NIPS workshop on optimization for machine learning (OPT2015) [arXiv:1511.01029].

BM, K. Adithya Apuroop, and R. Sepulchre, “A Riemannian geometry for low-rank matrix completion”, Technical report, 2012 [arXiv:1211.1550] [qGeomMC Matlab code webpage].

Journal articles

Y. Shi, BM, and W. Chen, “Topological interference management with user admission control via Riemannian optimization”, accepted to IEEE Transactions on Wireless Communications, 2017 [arXiv:1607.07252] [Publisher’s pdf].

BM and R. Sepulchre, “Riemannian preconditioning”, SIAM Journal on Optimization, 26(1), pp. 635 – 660, 2016 [arXiv:1405.6055] [Publisher’s pdf].

Y. Sun, J. Gao, X. Hong, BM, and B. Yin, “Heterogeneous tensor decomposition for clustering via manifold optimization”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(3), pp. 476 – 489, 2016 [arXiv:1504.01777] [Publisher’s pdf].

N. Boumal, BM, P.-A. Absil, and R. Sepulchre, “Manopt: a Matlab toolbox for optimization on manifolds”, Journal of Machine Learning Research 15(Apr), pp. 1455 – 1459, 2014 [Publisher’s pdf] [arXiv:1308.5200] [Webpage].

BM, G. Meyer, S. Bonnabel, and R. Sepulchre, “Fixed-rank matrix factorizations and Riemannian low-rank optimization”, Computational Statistics 29(3 – 4), pp. 591 – 621, 2014 [Publisher’s pdf] [arXiv:1209.0430] [Matlab codes supplied with the qGeomMC package].

BM, G. Meyer, F. Bach, and R. Sepulchre, “Low-rank optimization with trace norm penalty”, SIAM Journal on Optimization 23(4), pp. 2124 – 2149, 2013 [Publisher’s pdf] [arXiv:1112.2318] [Matlab code webpage].

Conference proceedings

Y. Shi, BM, and W. Chen, “A sparse and low-rank optimization framework for network topology control in dense fog-RAN”, accepted to the IEEE 85th Vehicular Technology Conference, 2017. The earlier title was “A sparse and low-rank optimization framework for index coding via Riemannian optimization”, Technical report, 2016 [arXiv:1604.04325] [Matlab codes]. A shorter version of the work, presenting a unified approach and titled “Sparse and low-rank decomposition for big data systems via smoothed Riemannian optimization”, also appeared in the 9th NIPS workshop on optimization for machine learning (OPT2016), held in Barcelona.

BM and R. Sepulchre, “Scaled stochastic gradient descent for low-rank matrix completion”, accepted to the 55th IEEE Conference on Decision and Control, 2016 [Publisher’s copy] [arXiv:1603.04989] [Matlab codes]. A different version was accepted to the internal Amazon machine learning conference (AMLC), 2016; AMLC is an internal platform for Amazon researchers to present their work.

H. Kasai and BM, “Low-rank tensor completion: a Riemannian manifold preconditioning approach”, accepted to ICML 2016 [Publisher’s copy] [Supplementary material] [arXiv:1605.08257]. A shorter version appeared at the 8th NIPS workshop on optimization for machine learning (OPT2015). An earlier version was titled “Riemannian preconditioning for tensor completion”, 2015 [arXiv:1506.02159] [Matlab code webpage].

R. Liégeois, BM, M. Zorzi, and R. Sepulchre, “Sparse plus low-rank autoregressive identification in neuroimaging time series”, accepted for publication in the proceedings of the 54th IEEE Conference on Decision and Control, 2015 [Publisher’s pdf] [arXiv:1503.08639] [Matlab code webpage].

BM and R. Sepulchre, “R3MC: A Riemannian three-factor algorithm for low-rank matrix completion”, Proceedings of the 53rd IEEE Conference on Decision and Control, pp. 1137 – 1142, 2014 [arXiv:1306.2672] [R3MC Matlab code webpage] [Publisher’s pdf].

BM and B. Vandereycken, “A Riemannian approach to low-rank algebraic Riccati equations”, Proceedings of the 21st International Symposium on Mathematical Theory of Networks and Systems, Extended abstract, pp. 965 – 968, 2014 [Publisher’s pdf] [arXiv:1312.4883] [Matlab code webpage].

BM, G. Meyer, and R. Sepulchre, “Low-rank optimization for distance matrix completion”, Proceedings of the 50th IEEE Conference on Decision and Control, pp. 4455 – 4460, 2011 [Publisher’s pdf] [arXiv:1304.6663] [Matlab code webpage].

Seminars, workshops, and symposiums

Gave a seminar in the numerical analysis group at the University of Geneva on 07 October 2016.

Gave a lecture at the IFCAM summer school on large-scale optimization at IISc on 07 July 2016. The talk focused on manifold optimization: applications and algorithms.

On 30 September 2015, gave a seminar talk on “Manopt: a Matlab toolbox for optimization on manifolds” in the department of systems and control engineering at IIT Bombay. Gave a second seminar talk on the same topic at the School of Electrical Sciences, IIT Bhubaneswar, on 12 October 2015.

In the workshop on low-rank optimization and applications at the Hausdorff Center for Mathematics, Bonn, June 2015, presented Riemannian preconditioning.

On 04 June 2015, gave a seminar talk on a Riemannian approach to large-scale constrained least-squares with symmetries at UCLouvain, Belgium, as part of the big data seminar series.

In the TCMM Workshop 2014, together with Nicolas Boumal, presented the Manopt toolbox.

In the Dolomites Workshop 2013, presented the work on “Fixed-rank optimization on Riemannian quotient manifolds”.

In the Benelux meeting on systems and control 2013, presented the work on “Tuning metrics for low-rank matrix completion” [Book of Abstracts: Page 157].

In ISMP 2012, presented the work on “Fixed-rank matrix factorizations and the design of invariant optimization algorithms” [Book of Abstracts: Page 145].

In the Benelux meeting on systems and control 2012, presented the work on “Low-rank optimization on the set of low-rank non-symmetric matrices” [Book of Abstracts: Page 103].

In the Benelux meeting on systems and control 2011, presented the work on “Manifold based optimization techniques for low-rank distance matrix completion” [Book of Abstracts: Page 75].

Posters

Presented three posters at NIPS workshops, 2017: on structured low-rank learning, a dual framework for tensor completion, and a two-dimensional gossip approach to matrix completion.

Presented three posters at the NIPS workshop on optimization for machine learning (OPT2016), Barcelona, 2016: on RSVRG, gossip for matrix completion, and low-rank and sparse decomposition.

Along with Hiroyuki Kasai, presented a poster on “Low-rank tensor completion: a Riemannian manifold preconditioning approach” at ICML 2016.

Scaled SGD for matrix completion at the internal Amazon machine learning conference, Seattle, 2016.

Two posters at the NIPS workshop on optimization for machine learning, 2015: one on tensor completion and the other on symmetry-invariant optimization in deep networks.

Riemannian preconditioning for tensor completion at the workshop on low-rank optimization and applications, Hausdorff Center for Mathematics, Bonn, 2015.

Riemannian preconditioning at the IAP DySCO study day, 2014.

Trace norm minimization at the IAP DySCO study day, 2012.

Miscellaneous

I maintain a list of fixed-rank algorithms and geometries here. The list is a work in progress.