2D compressed learning: support matrix machine with bilinear random projections
Support matrix machine (SMM) is an efficient matrix classification method that can leverage the structural information within a matrix to improve classification performance. However, its computational and storage costs remain expensive for high-dimensional data. To address these problems, in this paper we consider a 2D compressed learning paradigm that learns the SMM classifier in a compressed data domain. Specifically, we use Kronecker compressed sensing (KCS) to obtain the compressive measurements and learn the SMM classifier on them. We show that the Kronecker product measurement matrices used by KCS satisfy the restricted isometry property (RIP), a property that ensures the learnability of the compressed data. We further give a lower bound on the number of measurements required for KCS. Although this lower bound shows that KCS requires more measurements than regular CS to satisfy the same RIP condition, KCS still enjoys lower computational and storage complexities. Then, using the RIP condition, we verify that the SMM classifier learned in the compressed domain can perform almost as well as the best linear classifier in the original uncompressed domain. Finally, our experimental results demonstrate the feasibility of 2D compressed learning.
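The bilinear random projection underlying KCS can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it assumes Gaussian projection matrices and arbitrary illustrative dimensions, and checks the standard identity that the 2D measurement Y = Φ₁XΦ₂ᵀ equals the 1D measurement (Φ₂ ⊗ Φ₁)vec(X), which is why the Kronecker product matrix is the effective measurement operator.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 8, 8    # size of the original data matrix (illustrative)
m1, m2 = 3, 4    # size of the compressed measurement matrix (illustrative)

X = rng.standard_normal((n1, n2))                    # data matrix
Phi1 = rng.standard_normal((m1, n1)) / np.sqrt(m1)   # left (row) projection
Phi2 = rng.standard_normal((m2, n2)) / np.sqrt(m2)   # right (column) projection

# Bilinear random projection: Y = Phi1 @ X @ Phi2^T
Y = Phi1 @ X @ Phi2.T

# Equivalent 1D view via the identity vec(A X B^T) = (B kron A) vec(X),
# using column-major (Fortran-order) vectorization.
vecX = X.flatten(order="F")
Y_kron = (np.kron(Phi2, Phi1) @ vecX).reshape((m1, m2), order="F")

assert np.allclose(Y, Y_kron)

# Storage advantage of KCS: the two small factors vs. the full Kronecker matrix.
print(Phi1.size + Phi2.size)        # entries stored by KCS
print(m1 * m2 * n1 * n2)            # entries of the explicit Kronecker matrix
```

The final two prints show the storage gap the abstract refers to: KCS keeps only the two small factor matrices rather than the full (m₁m₂) × (n₁n₂) Kronecker product.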
Keywords: 2D compressed learning · Bilinear random projection · Dimension reduction · Support matrix machine · Kronecker compressed learning
We would like to express our appreciation to the editors and the reviewers, who have greatly helped us improve the quality of the paper. This work is supported by the National Natural Science Foundation of China (NSFC) under Grant No. 61672281 and the Key Program of NSFC under Grant No. 61732006.