Principal component analysis (PCA) is an important linear method for dimensionality reduction. It measures data distortion globally by the Frobenius norm of the matrix of data differences, projecting the data onto the span of several leading eigenvectors of the covariance matrix of the data set. Hence, PCA may not preserve the local separation of the original data. To respect local properties of the data in dimensionality reduction (DR), we employ Lipschitz embedding. Random projection is a powerful method for constructing Lipschitz mappings that realize dimensionality reduction with high probability: when both the dimension and the cardinality of the data set are large, it introduces only a small distortion. Random projection maps the original high-dimensional data onto a randomly chosen lower-dimensional subspace. Because the projection costs linear computational time, the method is computationally efficient, yet achieves sufficient accuracy with high probability. In Section 7.1, we review Lipschitz embedding. In Section 7.2, we introduce random matrices and random projection algorithms. In Section 7.3, the validity of random projection is justified in detail; in particular, the Johnson–Lindenstrauss lemma is proved in this section. Applications of random projection are given in Section 7.4.
Keywords: Principal Component Analysis · Dimensionality Reduction · Face Recognition · Singular Value Decomposition · Random Matrices
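The random projection method described above can be sketched in a few lines. The following is a minimal illustration, assuming NumPy; the function name `random_projection` and the parameter choices are our own for this example, not the chapter's notation. Entries of the random matrix are drawn i.i.d. Gaussian with variance 1/k, so squared Euclidean distances are preserved in expectation, in the spirit of the Johnson–Lindenstrauss lemma.

```python
import numpy as np

def random_projection(X, k, rng=None):
    """Project n points in R^d (the rows of X) into R^k via a Gaussian random matrix."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Entries i.i.d. N(0, 1/k): squared distances are preserved in expectation,
    # and concentrate around their mean when k is moderately large.
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    # Projection costs O(n * d * k) arithmetic -- linear in the number of points.
    return X @ R

# Usage: reduce 100 points in R^1000 to R^50.
X = np.random.default_rng(0).normal(size=(100, 1000))
Y = random_projection(X, 50, rng=1)
```

Because the projection matrix is data-independent, no eigendecomposition of the covariance matrix is needed, in contrast to PCA; this is the source of the method's computational efficiency.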