Abstract
Hashing methods are efficient for large-scale image retrieval. The orthogonal k-means (ok-means) represents multiple cluster centers compactly through a compositional parameterization, and minimizes the quantization error by using coordinate descent to find the optimal rotation, scaling, and translation of image descriptor vectors. However, because coordinate descent reaches only a local optimum, the performance of ok-means depends heavily on the initialization of the rotation matrix and is therefore unstable. In this work, we propose the multiple orthogonal k-means hashing method to reduce this instability. For large-scale retrieval, a standard multiple-hash-table scheme with M tables requires M times the storage of a single-table scheme. We therefore propose a binary code selection scheme that reduces the storage of the multiple orthogonal k-means to the same size as a single table's. Experimental results show that the proposed method outperforms ok-means using the same amount of storage.
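The coordinate-descent quantization that ok-means builds on can be illustrated with a simplified, ITQ-style sketch (rotation and translation only, omitting the scaling step of ok-means; the function and parameter names below are illustrative, not from the paper). Starting from a random orthogonal rotation, it alternates between updating the binary codes and solving an orthogonal Procrustes problem for the rotation; the dependence of the result on the random initialization is exactly the instability the paper addresses.

```python
import numpy as np

def learn_rotation(X, n_iter=50, seed=0):
    """Alternating minimization of ||(X - mean) R - B||_F^2
    over an orthogonal rotation R and binary codes B in {-1, +1}."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                 # translation (centering) step
    d = Xc.shape[1]
    # Random orthogonal initialization: the local optimum found
    # by coordinate descent depends on this draw.
    R, _ = np.linalg.qr(rng.standard_normal((d, d)))
    for _ in range(n_iter):
        B = np.sign(Xc @ R)                 # code update: nearest binary vertex
        # Rotation update: orthogonal Procrustes solution via SVD,
        # maximizing trace(R^T Xc^T B).
        U, _, Wt = np.linalg.svd(Xc.T @ B)
        R = U @ Wt
    return R, np.sign(Xc @ R)
```

Each of the two updates can only decrease the quantization error, so the objective converges, but different random rotations can converge to local optima of very different quality.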
Copyright information
© 2014 Springer-Verlag Berlin Heidelberg
Cite this paper
Zeng, Z., Lv, Y., Ng, W.W.Y. (2014). Multiple Orthogonal K-means Hashing. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-45651-4
Online ISBN: 978-3-662-45652-1