Constrained Grouped Sparsity

  • Conference paper
AI 2012: Advances in Artificial Intelligence (AI 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7691)

Abstract

In this paper, we propose an algorithm that encourages group sparsity under a convex constraint. It is motivated by applications where the regression coefficients are subject to constraints, for example nonnegativity, and where the explanatory variables cannot be orthogonalized within groups. The algorithm takes the form of the group LASSO based on a linear regression model, in which an L1/L2 norm is imposed on the group coefficients to achieve group sparsity. It differs from the original group LASSO in two ways. First, the regression coefficients must obey convex constraints. Second, there is no orthogonality requirement on the variables within individual groups. For these reasons, simple blockwise coordinate descent over all group coefficients is no longer applicable, and a special treatment of the constraint is necessary. The algorithm proposed in this paper is an alternating direction method, for which both exact and inexact solutions are provided. The inexact version simplifies the computation while retaining practical convergence. As an approximation to the group L0 penalty, the method can be applied to data analysis where groupwise fitting is essential and the coefficients are constrained. It may also serve as a screening procedure to reduce the number of groups when the total number of groups is prohibitively large.
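The alternating direction scheme described above can be made concrete with a short sketch. The following Python code is a minimal illustration under stated assumptions, not the paper's exact algorithm: it takes nonnegativity as the example convex constraint, splits the coefficients into a constrained least-squares block beta and a penalized copy z, implements the inexact idea as a single linearized (projected-gradient) beta-update, and applies exact groupwise soft-thresholding for z. All names (X, y, groups, lam, rho) and the toy data are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact algorithm) of an alternating
# direction method for a constrained group LASSO:
#   minimize  0.5 * ||y - X @ beta||^2 + lam * sum_g ||beta_g||_2
#   subject to beta >= 0   (nonnegativity as the example convex constraint).
import numpy as np

def group_soft_threshold(v, groups, thresh):
    """Blockwise shrinkage: scale each group of v toward zero by `thresh`."""
    z = np.zeros_like(v)
    for g in groups:                      # g is an index array for one group
        norm_g = np.linalg.norm(v[g])
        if norm_g > thresh:
            z[g] = (1.0 - thresh / norm_g) * v[g]
    return z

def constrained_group_lasso(X, y, groups, lam, rho=1.0, n_iter=500):
    """Inexact ADM: beta carries the constraint, z carries the penalty."""
    n, p = X.shape
    beta = np.zeros(p)
    z = np.zeros(p)                       # consensus copy of beta
    u = np.zeros(p)                       # scaled dual variable for beta = z
    # Step size for the linearized beta-update: reciprocal of the Lipschitz
    # constant of the smooth part, ||X^T X||_2 + rho.
    tau = 1.0 / (np.linalg.norm(X, 2) ** 2 + rho)
    for _ in range(n_iter):
        # Inexact beta-update: one gradient step on the augmented Lagrangian,
        # followed by projection onto the nonnegative orthant.
        grad = X.T @ (X @ beta - y) + rho * (beta - z + u)
        beta = np.maximum(beta - tau * grad, 0.0)
        # Exact z-update: groupwise soft-thresholding, the proximal operator
        # of the L1/L2 norm.
        z = group_soft_threshold(beta + u, groups, lam / rho)
        # Dual ascent on the consensus constraint beta = z.
        u = u + beta - z
    return z

# Toy usage: 10 variables in 4 groups, only groups 0 and 2 truly active.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
groups = [np.arange(0, 3), np.arange(3, 5), np.arange(5, 8), np.arange(8, 10)]
beta_true = np.zeros(10)
beta_true[groups[0]] = [1.0, 2.0, 0.5]
beta_true[groups[2]] = [1.5, 0.0, 1.0]
y = X @ beta_true + 0.05 * rng.standard_normal(50)
print(np.round(constrained_group_lasso(X, y, groups, lam=2.0), 2))
```

At convergence beta and z coincide, so either can be reported; z is returned here because the groupwise soft-thresholding gives it exact zeros on discarded groups, which is the property that matters when the method is used for screening.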

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Guo, Y., Gao, J., Hong, X. (2012). Constrained Grouped Sparsity. In: Thielscher, M., Zhang, D. (eds) AI 2012: Advances in Artificial Intelligence. AI 2012. Lecture Notes in Computer Science (LNAI), vol. 7691. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35101-3_37

  • DOI: https://doi.org/10.1007/978-3-642-35101-3_37

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-35100-6

  • Online ISBN: 978-3-642-35101-3

  • eBook Packages: Computer Science (R0)
