Abstract
In this paper, we discuss existing and new connections between latent variable models from machine learning and tensors (multi-way arrays) from multilinear algebra. A few ideas have been developed independently in the two communities, but there remain many useful, unexplored links, and ideas could still be borrowed from one community and used in the other. We start from simple concepts, such as independent variables and rank-1 matrices, and gradually increase the difficulty. The final goal is to connect discrete latent tree graphical models to state-of-the-art tensor decompositions in order to find tractable representations of probability tables over many variables.
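The simplest of the connections mentioned above can be sketched numerically: the joint probability table of two independent discrete variables is a rank-1 matrix, namely the outer product of the two marginal distributions. A minimal NumPy illustration, with made-up marginals (the names `p_x` and `p_y` are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical marginal distributions of two independent discrete
# variables X and Y (values chosen arbitrarily for illustration).
p_x = np.array([0.2, 0.5, 0.3])   # P(X = i)
p_y = np.array([0.6, 0.4])        # P(Y = j)

# Independence means the joint table factors as an outer product:
# P(X = i, Y = j) = P(X = i) * P(Y = j), i.e. a rank-1 matrix.
joint = np.outer(p_x, p_y)

print(np.linalg.matrix_rank(joint))   # prints 1
print(abs(joint.sum() - 1.0) < 1e-12) # prints True: still a valid table
```

Dependence between the variables raises the rank of the joint table; for many variables the table becomes a tensor, which is where the decompositions discussed in the paper enter.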
Notes
- 1.
The values of \(P_i(j)\) actually lie in the interval [0, 1], but for the purposes of this paper it is easier to think of them as real numbers.
Acknowledgments
This work was supported in part by the Fund for Scientific Research (FWO-Vlaanderen), by FWO project G.0280.15N, by the Flemish Government (Methusalem), by the Belgian Government through the Inter-university Poles of Attraction (IAP VII) Program (DYSCO II, Dynamical systems, control and optimization, 2012–2017), by the ERC Advanced Grant SNLSID under contract 320378, and by the ERC Starting Grant SLRA under contract 258581. Mariya Ishteva was an FWO Pegasus Marie Curie Fellow.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Ishteva, M. (2015). Tensors and Latent Variable Models. In: Vincent, E., Yeredor, A., Koldovský, Z., Tichavský, P. (eds) Latent Variable Analysis and Signal Separation. LVA/ICA 2015. Lecture Notes in Computer Science(), vol 9237. Springer, Cham. https://doi.org/10.1007/978-3-319-22482-4_6
Print ISBN: 978-3-319-22481-7
Online ISBN: 978-3-319-22482-4