
L2 Normalized Data Clustering Through the Dirichlet Process Mixture Model of von Mises Distributions with Localized Feature Selection

  • Wentao Fan
  • Nizar Bouguila
  • Yewang Chen
  • Ziyi Chen
Chapter
Part of the Unsupervised and Semi-Supervised Learning book series (UNSESUL)

Abstract

In this chapter, we propose a probabilistic model-based approach for clustering L2-normalized data. Our approach is based on the Dirichlet process mixture model of von Mises (VM) distributions. Since it assumes an infinite number of clusters (i.e., mixture components), the Dirichlet process mixture model of VM distributions can also be regarded as the infinite VM mixture model. Compared with the finite mixture model, in which the number of mixture components has to be determined through extra effort, the infinite VM mixture model is a nonparametric model in which the number of mixture components is initially assumed to be infinite and is inferred automatically during the learning process. To improve clustering performance for high-dimensional data, a localized feature selection scheme is integrated into the infinite VM mixture model, which can effectively detect irrelevant features based on the estimated feature saliencies. In order to learn the proposed infinite mixture model with localized feature selection, we develop an effective variational inference approach that estimates model parameters and feature saliencies with closed-form solutions. Our model-based clustering approach is validated through two challenging applications, namely topic novelty detection and unsupervised image categorization.
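
As a rough illustration of clustering on the unit hypersphere, the sketch below fits a finite mixture of von Mises-Fisher distributions to L2-normalized vectors with a plain EM loop. It is only a simplified stand-in, not the chapter's nonparametric Dirichlet process model, its variational inference scheme, or its localized feature selection: the component count K, the Banerjee et al. approximation for the concentration parameter, and all function and variable names are assumptions made for this example.

    # Illustrative sketch only: EM for a finite von Mises-Fisher (vMF) mixture
    # on L2-normalized data. Not the chapter's variational DP-VM algorithm.
    import numpy as np
    from scipy.special import ive  # exponentially scaled modified Bessel function

    def log_vmf_density(X, mu, kappa):
        """Log density of vMF(mu, kappa) at each row of the unit-norm matrix X."""
        d = X.shape[1]
        v = d / 2.0 - 1.0
        # log I_v(kappa) = log(ive(v, kappa)) + kappa  (ive avoids overflow)
        log_norm = v * np.log(kappa) - (d / 2.0) * np.log(2.0 * np.pi) \
                   - (np.log(ive(v, kappa)) + kappa)
        return log_norm + kappa * (X @ mu)

    def fit_vmf_mixture(X, K=3, n_iter=100, seed=0):
        """EM for a K-component vMF mixture; X must be L2-normalized row-wise."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        mu = X[rng.choice(n, K, replace=False)]   # initialize means from data
        kappa = np.full(K, 10.0)
        weights = np.full(K, 1.0 / K)
        for _ in range(n_iter):
            # E-step: responsibilities via log-sum-exp
            log_p = np.stack([np.log(weights[k]) + log_vmf_density(X, mu[k], kappa[k])
                              for k in range(K)], axis=1)
            log_p -= log_p.max(axis=1, keepdims=True)
            resp = np.exp(log_p)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: closed-form mean directions; kappa via Banerjee et al. approximation
            for k in range(K):
                r_k = resp[:, k] @ X
                nk = resp[:, k].sum()
                mu[k] = r_k / np.linalg.norm(r_k)
                r_bar = min(np.linalg.norm(r_k) / nk, 0.99)  # guard degenerate clusters
                kappa[k] = r_bar * (d - r_bar**2) / (1.0 - r_bar**2)
                weights[k] = nk / n
        return weights, mu, kappa, resp

    # Usage: cluster toy L2-normalized vectors
    X = np.random.default_rng(1).normal(size=(200, 5))
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    weights, mu, kappa, resp = fit_vmf_mixture(X, K=3)
    labels = resp.argmax(axis=1)

In contrast to this fixed-K sketch, the chapter's Dirichlet process formulation lets the effective number of components be inferred during learning, and the localized feature selection additionally weights each feature's relevance per cluster.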

Keywords

Clustering · Spherical data · von Mises distribution · Mixture models · Feature selection · Novelty detection · Image categorization

Acknowledgements

The completion of this work was supported by the National Natural Science Foundation of China (61876068), the Natural Science Foundation of Fujian Province (2018J01094), and the Promotion Program for Young and Middle-aged Teacher in Science and Technology Research of Huaqiao University (ZQNPY510).


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Wentao Fan (1)
  • Nizar Bouguila (2)
  • Yewang Chen (1)
  • Ziyi Chen (1)
  1. Department of Computer Science and Technology, Huaqiao University, Xiamen, China
  2. Concordia Institute for Information Systems Engineering, Concordia University, Montreal, Canada
