<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Reza Godaz</style></author><author><style face="normal" font="default" size="100%">Benyamin Ghojogh</style></author><author><style face="normal" font="default" size="100%">Reshad Hosseini</style></author><author><style face="normal" font="default" size="100%">Reza Monsefi</style></author><author><style face="normal" font="default" size="100%">Fakhri Karray</style></author><author><style face="normal" font="default" size="100%">Mark Crowley</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Vector Transport Free Riemannian LBFGS for Optimization on Symmetric Positive Definite Matrix Manifolds</style></title><secondary-title><style face="normal" font="default" size="100%">Asian Conference on Machine Learning (ACML)</style></secondary-title></titles><dates><year><style face="normal" font="default" size="100%">Accepted</style></year><pub-dates><date><style face="normal" font="default" size="100%">November</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.acml-conf.org/2021/conference/accepted-papers/81/</style></url></web-urls></urls><pub-location><style face="normal" font="default" size="100%">Virtual</style></pub-location><pages><style face="normal" font="default" size="100%">8</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">This work concentrates on optimization on Riemannian manifolds. The Limited-memory Broyden–Fletcher–Goldfarb–Shanno (LBFGS) algorithm is a commonly used quasi-Newton method for numerical optimization in Euclidean spaces. Riemannian LBFGS (RLBFGS) is an extension of this method to Riemannian manifolds.
RLBFGS involves computationally expensive vector transports, as well as unfolding recursions using adjoint vector transports. In this article, we propose two mappings in the tangent space, based on the inverse square root and the Cholesky decomposition. These mappings make both the vector transport and its adjoint the identity map, and therefore isometric. The identity vector transport makes RLBFGS less computationally expensive, and its isometry is also very useful in the convergence analysis of RLBFGS. Moreover, under the proposed mappings, the Riemannian metric reduces to the Euclidean inner product, which is much less computationally expensive. We focus on Symmetric Positive Definite (SPD) manifolds, which are useful in various fields such as data science and statistics. This work opens a research opportunity for extending the proposed mappings to other well-known manifolds.</style></abstract></record></records></xml>