Latent Semantic Analysis (LSA): Reduction of Dimensions

  • Grigori Sidorov
Chapter
Part of the SpringerBriefs in Computer Science book series (BRIEFSCOMPUTER)

Abstract

After building the vector space model, we can represent and compare objects of any type in our study. A natural next question is whether the vector space we have built can be improved. This question matters because the vector space model can have thousands of features, and many of them may be redundant. Is there a way to discard the features that contribute little? Latent Semantic Analysis allows us to construct a new vector space model with a smaller number of dimensions.
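As a minimal sketch of the idea, assuming scikit-learn is available: build a TF-IDF term-document vector space and then apply truncated SVD, which is a standard way to compute an LSA-style reduction. The corpus and the number of components below are purely illustrative choices, not values from the chapter.

    # Minimal LSA sketch: TF-IDF vector space, then truncated SVD
    # to project documents into a lower-dimensional space.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    corpus = [
        "the cat sat on the mat",
        "the dog sat on the log",
        "cats and dogs are pets",
        "stock markets fell sharply today",
        "investors sold shares as markets fell",
    ]

    # Original vector space model: one dimension per term.
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(corpus)      # shape: (n_documents, n_terms)

    # New vector space with fewer dimensions (2 here, for illustration).
    lsa = TruncatedSVD(n_components=2, random_state=0)
    X_reduced = lsa.fit_transform(X)          # shape: (n_documents, 2)

    print(X.shape, "->", X_reduced.shape)

Documents can then be compared (for example, by cosine similarity) in the reduced space instead of the original high-dimensional one.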

Copyright information

© The Author(s), under exclusive licence to Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Grigori Sidorov
  1. Instituto Politécnico Nacional, Centro de Investigación en Computación, Mexico City, Mexico
