Dimension reduction is an important research topic in the area of unsupervised learning. Dimension reduction techniques aim to find a low-dimensional subspace that best represents a given set of data points. These techniques have a broad range of applications including data compression, visualization, exploratory data analysis, pattern recognition, etc.

In this chapter, we present three representative dimension reduction techniques: Singular Value Decomposition (SVD), Independent Component Analysis (ICA), and Local Linear Embedding (LLE). Dimension reduction based on singular value decomposition is also referred to as principal component analysis (PCA) in much of the literature. We start the chapter by discussing the goals and objectives of dimension reduction techniques, followed by detailed descriptions of SVD, ICA, and LLE. In the last section of the chapter, we provide a case study in which the three techniques are applied to the same data set and the subspaces they generate are compared to reveal their characteristics.
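The SVD-based reduction mentioned above can be sketched in a few lines. This is a minimal illustration assuming NumPy; the function name `svd_reduce`, the synthetic data, and the choice of two target dimensions are illustrative and not taken from the chapter:

```python
import numpy as np

def svd_reduce(X, k):
    """Project an n-by-d data matrix X onto its top-k principal directions
    using the singular value decomposition of the centered data."""
    Xc = X - X.mean(axis=0)                             # center each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)   # thin SVD
    return Xc @ Vt[:k].T                                # coordinates in the k-dim subspace

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = svd_reduce(X, 2)
print(Z.shape)  # (100, 2)
```

The rows of `Vt` are the right singular vectors of the centered data; keeping the first `k` of them gives the subspace that best represents the data in the least-squares sense, which is why this procedure coincides with PCA.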

Keywords

Mutual Information · Singular Value Decomposition · Independent Component Analysis · Dimension Reduction · Singular Vector

Copyright information

© Springer Science+Business Media, LLC 2007