Abstract
Subspace clustering algorithms are notorious for their scalability issues, because building and processing large affinity matrices is demanding. In this paper, we introduce a method that simultaneously learns an embedding space along with subspaces within it, minimizing a notion of reconstruction error and thereby addressing subspace clustering in an end-to-end learning paradigm. To achieve this, we propose a scheme for updating subspaces within a deep neural network, which in turn frees us from the need for an affinity matrix to perform clustering. Unlike previous attempts, our method easily scales up to large datasets, making it unique in the context of unsupervised learning with deep architectures. Our experiments show that our method significantly improves clustering accuracy while requiring a smaller memory footprint.
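The paper's deep variant learns the subspaces inside a neural network; for intuition, the underlying k-subspace idea it builds on can be sketched in its classical (shallow) form. The sketch below is our own illustration, not the authors' implementation: it alternates between assigning each point to the subspace with the smallest reconstruction error and refitting each subspace basis by PCA on its assigned points. All function and variable names are ours.

```python
import numpy as np

def k_subspaces(X, k, dim, n_iters=50, seed=0):
    """Classical k-subspace clustering (illustrative sketch only).

    Alternates between (1) assigning each point to the subspace with the
    smallest reconstruction error and (2) refitting each subspace basis as
    the top principal directions of its assigned points.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    labels = np.full(n, -1)
    # Random orthonormal bases, one (d x dim) matrix per subspace.
    bases = [np.linalg.qr(rng.standard_normal((d, dim)))[0] for _ in range(k)]
    for _ in range(n_iters):
        # Assignment: residual of projecting each point onto each subspace.
        residuals = np.stack(
            [np.linalg.norm(X - (X @ U) @ U.T, axis=1) for U in bases], axis=1
        )
        new_labels = residuals.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # assignments stable: converged
        labels = new_labels
        # Update: top `dim` right singular vectors of each cluster's points.
        for j in range(k):
            pts = X[labels == j]
            if len(pts) >= dim:
                _, _, Vt = np.linalg.svd(pts, full_matrices=False)
                bases[j] = Vt[:dim].T
    return labels, bases
```

On noise-free points drawn from two orthogonal lines in 3-D, this recovers the two subspaces exactly; the paper replaces the shallow representation with an embedding learned jointly with the subspaces, avoiding the affinity matrix that spectral methods require.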
Notes
1. Among all the datasets that have been tested, COIL100 with 7,200 images seems to be the largest one.
2. We assume \(p=p_i,\forall i\) in the remainder.
3. In our experiments, the number of parameters in SAE is 2,600 times more than that of CAE.
Acknowledgement
This research was supported by the Australian Research Council (ARC) Discovery Projects funding scheme (project DP150104645), by an ARC Laureate Fellowship (FL130100102) to IDR, and by the ARC Centre of Excellence for Robotic Vision (project number CE140100016).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Zhang, T., Ji, P., Harandi, M., Hartley, R., Reid, I. (2019). Scalable Deep k-Subspace Clustering. In: Jawahar, C., Li, H., Mori, G., Schindler, K. (eds) Computer Vision – ACCV 2018. ACCV 2018. Lecture Notes in Computer Science(), vol 11365. Springer, Cham. https://doi.org/10.1007/978-3-030-20873-8_30
Print ISBN: 978-3-030-20872-1
Online ISBN: 978-3-030-20873-8