Dimension reduction is an important research topic in the area of unsupervised learning. Dimension reduction techniques aim to find a low-dimensional subspace that best represents a given set of data points. These techniques have a broad range of applications, including data compression, visualization, exploratory data analysis, and pattern recognition.
In this chapter, we present three representative dimension reduction techniques: Singular Value Decomposition (SVD), Independent Component Analysis (ICA), and Local Linear Embedding (LLE). Dimension reduction based on singular value decomposition is also referred to as principal component analysis (PCA) in much of the literature. We start the chapter by discussing the goals and objectives of dimension reduction techniques, followed by detailed descriptions of SVD, ICA, and LLE. In the last section of the chapter, we provide a case study in which the three techniques are applied to the same data set and the subspaces they generate are compared to reveal their characteristics.
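As a minimal sketch of the SVD/PCA connection mentioned above: centering a data matrix and keeping the top right singular vectors yields the principal-component projection. The function name and data here are illustrative, not from the chapter.

```python
import numpy as np

def reduce_dim(X, k):
    """Project X (n_samples x n_features) onto its top-k principal directions via SVD."""
    Xc = X - X.mean(axis=0)                           # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False) # singular values sorted descending
    return Xc @ Vt[:k].T                              # coordinates in the k-dim subspace

# Illustrative data: 100 points in 5 dimensions, reduced to 2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = reduce_dim(X, 2)
print(Z.shape)  # (100, 2)
```

Because the singular values are ordered, the first retained direction captures at least as much variance as the second, which is the defining property of the PCA subspace.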
Keywords: Mutual Information · Singular Value Decomposition · Independent Component Analysis · Dimension Reduction · Singular Vector