
Some Recent Advances in Multiscale Geometric Analysis of Point Clouds

  • Guangliang Chen
  • Anna V. Little
  • Mauro Maggioni
  • Lorenzo Rosasco
Chapter
Part of the Applied and Numerical Harmonic Analysis book series (ANHA)

Abstract

We discuss recent work based on multiscale geometric analysis for the study of large data sets that lie in high-dimensional spaces but have low-dimensional structure. We present three applications: the first to the estimation of the intrinsic dimension of sampled manifolds, the second to the construction of multiscale dictionaries, called Geometric Wavelets, for the analysis of point clouds, and the third to the inference of point clouds modeled as unions of multiple planes of varying dimensions.
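
As a rough illustration of the first application, the sketch below (Python/NumPy; the function and toy data are our own hypothetical example, not the authors' code) computes the singular values of local covariances around one point of a noisy one-dimensional curve embedded in R^20, over a range of radii. Singular values along tangent directions grow roughly linearly with the radius, while curvature and noise directions lag behind; the resulting gap suggests the intrinsic dimension.

    # Minimal multiscale-SVD sketch: local singular values as a function of scale.
    import numpy as np

    def multiscale_singular_values(X, z, radii):
        """Singular values of the local covariance of X around z, one row per radius."""
        out = []
        for r in radii:
            nbrs = X[np.linalg.norm(X - z, axis=1) <= r]
            if len(nbrs) < 2:
                out.append(np.zeros(X.shape[1]))
                continue
            centered = nbrs - nbrs.mean(axis=0)
            s = np.linalg.svd(centered / np.sqrt(len(nbrs)), compute_uv=False)
            out.append(np.pad(s, (0, X.shape[1] - len(s))))
        return np.array(out)

    # Toy data: a 1-dimensional helix embedded in R^20 with small Gaussian noise.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 4 * np.pi, 2000))
    curve = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
    X = np.concatenate([curve, np.zeros((2000, 17))], axis=1)
    X += 0.01 * rng.normal(size=X.shape)

    sv = multiscale_singular_values(X, X[1000], radii=np.linspace(0.2, 2.0, 10))
    # At moderate radii one singular value dominates the rest, suggesting d = 1.
    print(np.round(sv[:, :4], 3))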

Keywords

Point Cloud · Singular Value Decomposition · Intrinsic Dimension · Principal Component Analysis · Spectral Clustering



Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  • Guangliang Chen
  • Anna V. Little
  • Mauro Maggioni 1
  • Lorenzo Rosasco
  1. Duke University, Durham, USA
