
Unsupervised Learning Using the Tensor Voting Graph

  • Conference paper
  • First Online:
Scale Space and Variational Methods in Computer Vision (SSVM 2015)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 9087)

Abstract

Tensor Voting is a local, non-parametric method that provides an efficient way to learn complex geometric manifold structure under a significant amount of outlier noise. The main limitation of the Tensor Voting framework is that it is strictly local, and therefore cannot efficiently infer the global properties of complex manifolds. We therefore propose constructing a graph, which we call the Tensor Voting Graph, in which the affinity between points is based on the contribution of each neighboring point to a point's local tangent space as estimated by Tensor Voting. The Tensor Voting Graph compactly and effectively represents the global structure of the underlying manifold. We experimentally demonstrate that it allows accurate estimation of geodesic distances on complex manifolds and substantially outperforms state-of-the-art competing approaches, especially in the presence of outliers. We also demonstrate the method's superior ability to segment manifolds, first on synthetic data and then on standard motion segmentation data sets, with graceful degradation in the presence of noise.
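For a concrete picture of the construction described above, the sketch below builds a neighborhood graph whose edge weights penalize neighbors that deviate from a point's estimated local tangent space, and then approximates geodesic distances by shortest paths on that graph. It is a minimal illustration only: the tangent space here is estimated by local PCA as a stand-in for the Tensor Voting estimate, the affinity formula and the -log edge-length conversion are illustrative assumptions, and the function names (local_tangent_affinity, geodesic_distances) are not from the paper.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra


def local_tangent_affinity(X, k=10, sigma=1.0, d=1):
    """Affinity graph in which an edge (i, j) is strong only if j is close to i
    AND lies near i's estimated tangent space.  The tangent basis comes from
    local PCA here, as a stand-in for the Tensor Voting estimate."""
    n = X.shape[0]
    dist2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    nbrs = np.argsort(dist2, axis=1)[:, 1:k + 1]           # k nearest neighbors
    rows, cols, vals = [], [], []
    for i in range(n):
        # d principal directions of the neighborhood = local tangent basis at i
        Ni = X[nbrs[i]] - X[i]
        _, _, Vt = np.linalg.svd(Ni, full_matrices=False)
        T = Vt[:d]                                          # (d, D) tangent basis
        for j in nbrs[i]:
            v = X[j] - X[i]
            normal = v - T.T @ (T @ v)                      # residual off the tangent plane
            # affinity: Gaussian in distance, damped by the normal residual
            w = np.exp(-np.dot(v, v) / sigma ** 2) * \
                np.exp(-np.dot(normal, normal) / sigma ** 2)
            rows.append(i)
            cols.append(j)
            vals.append(w)
    return csr_matrix((vals, (rows, cols)), shape=(n, n))


def geodesic_distances(X, k=10, sigma=1.0, d=1):
    """Approximate geodesic distances as shortest paths on the affinity graph,
    using -log(affinity) as an (illustrative) edge length."""
    W = local_tangent_affinity(X, k, sigma, d)
    lengths = W.copy()
    lengths.data = -np.log(np.clip(lengths.data, 1e-12, 1.0))
    return dijkstra(lengths, directed=False)
```

For manifold segmentation, the same affinity matrix (after symmetrizing) can be passed to a standard spectral clustering routine, e.g. scikit-learn's SpectralClustering with affinity='precomputed'; this is one common way to cluster points lying on multiple manifolds, not necessarily the exact pipeline used in the paper's experiments.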



Author information

Corresponding author

Correspondence to Shay Deutsch.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Deutsch, S., Medioni, G. (2015). Unsupervised Learning Using the Tensor Voting Graph. In: Aujol, J.-F., Nikolova, M., Papadakis, N. (eds) Scale Space and Variational Methods in Computer Vision. SSVM 2015. Lecture Notes in Computer Science, vol. 9087. Springer, Cham. https://doi.org/10.1007/978-3-319-18461-6_23

  • DOI: https://doi.org/10.1007/978-3-319-18461-6_23

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-18460-9

  • Online ISBN: 978-3-319-18461-6

  • eBook Packages: Computer Science, Computer Science (R0)
