
Rejoinder on: “On active learning methods for manifold data”


We thank the discussants for their comments and careful reading of our manuscript, which have enhanced and complemented our presentation. We also thank the editors of TEST for this opportunity to clarify some aspects of our work in more detail. In what follows, we first address points raised by both sets of discussants, and then turn to comments made individually by each of them. We conclude by describing a method that can speed up the retraining required by the SSGP-AL method when used for classification, by reusing previous learning rather than re-estimating the GP model from scratch at each AL cycle.
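To illustrate the general idea of reusing previous learning across AL cycles, the sketch below shows one standard device for this purpose: extending the Cholesky factor of the GP covariance matrix by one row when a newly labeled point arrives, at O(n^2) cost instead of the O(n^3) cost of refactorizing from scratch. This is a minimal illustration under assumed choices (a squared-exponential kernel, fixed hyperparameters, the helper names `rbf` and `chol_append` are ours), not the actual SSGP-AL update described in the rejoinder.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel between row-stacked inputs A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def chol_append(L, X, x_new, noise=1e-6, ell=1.0):
    """Extend the Cholesky factor L of K(X, X) + noise*I with one new
    point x_new, avoiding a full refactorization (O(n^2) per update)."""
    k = rbf(X, x_new[None, :], ell).ravel()            # cross-covariances
    kss = rbf(x_new[None, :], x_new[None, :], ell)[0, 0] + noise
    l = np.linalg.solve(L, k)                          # forward solve: L l = k
    lss = np.sqrt(kss - l @ l)                         # Schur complement
    n = L.shape[0]
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = l
    L_new[n, n] = lss
    return L_new

# Usage: grow the factor one labeled point at a time, as in an AL cycle.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))
L = np.linalg.cholesky(rbf(X, X) + 1e-6 * np.eye(5))
x_star = rng.normal(size=2)
L = chol_append(L, X, x_star)
X = np.vstack([X, x_star])
K_full = rbf(X, X) + 1e-6 * np.eye(6)
assert np.allclose(L @ L.T, K_full)   # factor matches a from-scratch build
```

With the factor maintained incrementally, GP posterior quantities needed to score the next query point can be recomputed with triangular solves only, which is what makes per-cycle retraining cheap relative to refitting.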




Author information

Correspondence to Enrique Del Castillo.


This rejoinder refers to the discussants' comments on the original article.



Cite this article

Li, H., Del Castillo, E. & Runger, G. Rejoinder on: “On active learning methods for manifold data”. TEST 29, 42–49 (2020).
