Abstract
After building the vector space model, we can represent and compare any type of object in our study. A natural next question is whether the vector space we have built can be improved. This question matters because the vector space model may have thousands of features, many of which can be redundant. Is there a way to discard the features that contribute little? Latent Semantic Analysis allows us to construct a new vector space model with a smaller number of dimensions.
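As a minimal sketch of the idea behind LSA (not code from this chapter), the standard approach is to apply a truncated singular value decomposition to the term-document matrix and keep only the k largest singular values; the toy matrix and the choice k = 2 below are illustrative assumptions.

```python
# Hypothetical LSA sketch via truncated SVD; the data and k are made up.
import numpy as np

# Toy term-document matrix: rows = documents, columns = term frequencies.
X = np.array([
    [2, 1, 0, 0],
    [1, 2, 0, 0],
    [0, 0, 1, 3],
    [0, 0, 3, 1],
], dtype=float)

# Full SVD: X = U @ diag(s) @ Vt, with singular values in s sorted descending.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the k largest singular values -> reduced LSA space.
k = 2
X_reduced = U[:, :k] * s[:k]  # document vectors in k dimensions

print(X_reduced.shape)  # (4, 2)
```

In the reduced space, documents that share vocabulary (the first two rows, and likewise the last two) end up close together, while unrelated documents stay nearly orthogonal, which is exactly the dimensionality reduction LSA exploits.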
Copyright information
© 2019 The Author(s), under exclusive licence to Springer Nature Switzerland AG
Cite this chapter
Sidorov, G. (2019). Latent Semantic Analysis (LSA): Reduction of Dimensions. In: Syntactic n-grams in Computational Linguistics. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-030-14771-6_4
Print ISBN: 978-3-030-14770-9
Online ISBN: 978-3-030-14771-6