Abstract
We introduce a family of Log-Determinant (Log-Det) divergences on the set of symmetric positive definite (SPD) matrices that includes the Log-Euclidean distance as a special case. This family is then generalized to a family of Log-Det divergences on the set of positive definite Hilbert–Schmidt operators on a Hilbert space, with the Log-Hilbert–Schmidt distance as a special case. The divergences introduced here are novel in both the finite- and infinite-dimensional settings. We also generalize the Power Euclidean distances to the infinite-dimensional setting, obtaining what we call the Extended Power Hilbert–Schmidt distances. While these families include the Log-Euclidean and Log-Hilbert–Schmidt distances, respectively, as special cases, they do not satisfy the same invariances as those distances, in contrast to the Log-Det divergences. In the case of RKHS covariance operators, we provide closed-form formulas for all of the above divergences and distances in terms of the corresponding Gram matrices.
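For context, the sketch below illustrates only the baseline case mentioned in the abstract, the Log-Euclidean distance d(A, B) = ||log(A) − log(B)||_F between SPD matrices; it is not the paper's Log-Det divergence family or its Gram-matrix formulas, and the function names are illustrative choices of this summary rather than anything defined in the paper.

```python
import numpy as np

def spd_log(A):
    # Matrix logarithm of a symmetric positive definite matrix via eigendecomposition:
    # A = V diag(w) V^T  =>  log(A) = V diag(log w) V^T.
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    # Log-Euclidean distance d(A, B) = || log(A) - log(B) ||_F.
    return np.linalg.norm(spd_log(A) - spd_log(B), ord="fro")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 3))
    Y = rng.standard_normal((5, 3))
    # Random SPD matrices: sample second-moment matrices plus a small ridge
    # to guarantee strict positive definiteness.
    A = X.T @ X / 5 + 1e-3 * np.eye(3)
    B = Y.T @ Y / 5 + 1e-3 * np.eye(3)
    print(log_euclidean_distance(A, B))
```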
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Minh, H.Q. (2018). Infinite-Dimensional Log-Determinant Divergences III: Log-Euclidean and Log-Hilbert–Schmidt Divergences. In: Ay, N., Gibilisco, P., Matúš, F. (eds) Information Geometry and Its Applications. IGAIA IV 2016. Springer Proceedings in Mathematics & Statistics, vol 252. Springer, Cham. https://doi.org/10.1007/978-3-319-97798-0_8
DOI: https://doi.org/10.1007/978-3-319-97798-0_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-97797-3
Online ISBN: 978-3-319-97798-0
eBook Packages: Mathematics and Statistics (R0)