t-Distributed Stochastic Neighbor Embedding with Inhomogeneous Degrees of Freedom

  • Jun Kitazono
  • Nistor Grozavu
  • Nicoleta Rogovschi
  • Toshiaki Omori
  • Seiichi Ozawa
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9949)


Among dimensionality reduction (DR) methods for data visualization, t-distributed stochastic neighbor embedding (t-SNE) has drawn increasing attention. t-SNE yields better visualizations than conventional DR methods by relieving the so-called crowding problem, a curse-of-dimensionality effect caused by the discrepancy between the high- and low-dimensional spaces. However, t-SNE assumes that the strength of this discrepancy is the same for all samples and all datasets, regardless of the non-uniformity of the data distribution or the difference in dimensionality, and this assumption sometimes ruins the visualization. Here we propose a new DR method, inhomogeneous t-SNE, in which the strength is estimated for each point and each dataset. Experimental results show that such pointwise estimation is important for reasonable visualization and that the proposed method achieves better visualizations than the original t-SNE.
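The pointwise estimation described above can be illustrated with a small sketch. Standard t-SNE computes low-dimensional affinities with a Student-t kernel with one degree of freedom, (1 + ||y_i − y_j||²)⁻¹; the snippet below generalizes this to a per-point degrees-of-freedom parameter ν_i. The function name `t_affinities` and the exact form of the per-point kernel are illustrative assumptions, not the paper's precise formulation or estimation procedure:

```python
import numpy as np

def t_affinities(Y, nu):
    """Low-dimensional affinities q_ij from a Student-t kernel.

    Y  : (n, d) array of embedding coordinates.
    nu : scalar or (n,) degrees of freedom. Standard t-SNE fixes nu = 1
         for every point; a pointwise nu (a hypothetical generalization,
         in the spirit of the paper) gives each sample its own tail
         heaviness in the low-dimensional space.
    """
    n = Y.shape[0]
    nu = np.broadcast_to(np.asarray(nu, dtype=float), (n,))
    # Squared pairwise distances between embedding points.
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    # Student-t kernel; with nu_i = 1 this reduces to (1 + d^2)^(-1),
    # the kernel used by the original t-SNE.
    w = (1.0 + d2 / nu[:, None]) ** (-(nu[:, None] + 1.0) / 2.0)
    np.fill_diagonal(w, 0.0)  # q_ii is defined to be zero
    return w / w.sum()

# With nu = 1 everywhere, this is exactly the standard t-SNE kernel.
Y = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
q = t_affinities(Y, 1.0)
```

Note that with a genuinely pointwise ν the raw kernel is no longer symmetric in i and j; the paper's method would have to handle this (e.g. by symmetrizing), which this sketch does not attempt.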


Keywords: SNE · t-SNE · Dimensionality reduction · Degrees of freedom



This work was partially supported by the Grant for Enhancement of International Research from Kobe University, and Grants-in-Aid for Young Scientists (B) [No. 15K16064 (J.K.)] from the MEXT of Japan.



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Jun Kitazono (1)
  • Nistor Grozavu (2)
  • Nicoleta Rogovschi (3)
  • Toshiaki Omori (1)
  • Seiichi Ozawa (1)
  1. Graduate School of Engineering, Kobe University, Kobe, Japan
  2. LIPN UMR CNRS 7030, Sorbonne Paris Cité, Université Paris 13, Villetaneuse, France
  3. LIPADE, Université Paris Descartes, Paris, France
