
Estimation Under Model-Based Sparsity

Chapter in Algorithms for Sparsity-Constrained Optimization

Part of the book series: Springer Theses (volume 261)

Abstract

Beyond the ordinary, extensively studied, plain sparsity model, a variety of structured sparsity models have been proposed in the literature (Bach 2008; Roth and Fischer 2008; Jacob et al. 2009; Baraniuk et al. 2010; Bach 2010; Bach et al. 2012; Chandrasekaran et al. 2012; Kyrillidis and Cevher 2012a). These sparsity models are designed to capture the interdependence among the locations of the non-zero components that is known a priori in certain applications. For instance, the wavelet transforms of natural images are often (nearly) sparse, and the dependence among the dominant wavelet coefficients can be represented by a rooted, connected tree. Furthermore, in applications such as array processing or sensor networks, different sensors may take different measurements, but the support set of the observed signal is identical across the sensors. Therefore, to model this property of the system, we can compose an enlarged signal with a jointly sparse or block-sparse support set, whose non-zero coefficients occur in contiguous blocks.
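
To make the block-sparsity model concrete, here is a minimal Python sketch (not taken from the chapter; the function block_sparse_project and its parameters are illustrative names introduced here) that projects a signal onto the set of signals supported on at most k contiguous blocks by keeping the k blocks with the largest energy:

    # Toy illustration of the block-sparsity model (assumed example, not the
    # chapter's estimation algorithm): keep the k contiguous blocks with the
    # largest energy and zero out the rest.
    import numpy as np

    def block_sparse_project(x, block_size, k):
        """Zero out all but the k contiguous blocks of x with the largest energy."""
        n = x.size
        assert n % block_size == 0, "signal length must be a multiple of block_size"
        blocks = x.reshape(n // block_size, block_size)
        energies = np.sum(blocks**2, axis=1)      # energy of each block
        keep = np.argsort(energies)[::-1][:k]     # indices of the k strongest blocks
        projected = np.zeros_like(blocks)
        projected[keep] = blocks[keep]
        return projected.reshape(n)

    # Example: a length-12 signal whose dominant entries lie in two size-3 blocks.
    x = np.array([0.0, 0.0, 0.0, 2.0, -1.5, 0.7, 0.0, 0.1, 0.0, 0.1, 3.2, -0.4])
    print(block_sparse_project(x, block_size=3, k=2))

Tree-structured and joint-sparsity models admit analogous projections, with the admissible supports restricted to rooted connected subtrees of the wavelet tree or to a single support shared across all sensors, respectively.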

References

  • F. Bach. Structured sparsity-inducing norms through submodular functions. In Advances in Neural Information Processing Systems, volume 23, pages 118–126, Vancouver, BC, Canada, Dec. 2010.

  • F. Bach, R. Jenatton, J. Mairal, and G. Obozinski. Structured sparsity through convex optimization. Statistical Science, 27(4):450–468, Nov. 2012.

  • F. R. Bach. Consistency of the group Lasso and multiple kernel learning. Journal of Machine Learning Research, 9:1179–1225, June 2008.

  • R. G. Baraniuk, V. Cevher, M. F. Duarte, and C. Hegde. Model-based compressive sensing. IEEE Transactions on Information Theory, 56(4):1982–2001, 2010.

  • T. Blumensath. Compressed sensing with nonlinear observations. Preprint, 2010. URL http://users.fmrib.ox.ac.uk/~tblumens/papers/B_Nonlinear.pdf.

  • V. Chandrasekaran, B. Recht, P. A. Parrilo, and A. S. Willsky. The convex geometry of linear inverse problems. Foundations of Computational Mathematics, 12(6):805–849, Dec. 2012.

  • M. Duarte and Y. Eldar. Structured compressed sensing: From theory to applications. IEEE Transactions on Signal Processing, 59(9):4053–4085, Sept. 2011.

  • D. Hsu, S. Kakade, and T. Zhang. Tail inequalities for sums of random matrices that depend on the intrinsic dimension. Electron. Commun. Probab., 17(14):1–13, 2012.

  • L. Jacob, G. Obozinski, and J. Vert. Group Lasso with overlap and graph Lasso. In Proceedings of the 26th Annual International Conference on Machine Learning, ICML ’09, pages 433–440, New York, NY, USA, 2009.

  • R. Jenatton, J. Audibert, and F. Bach. Structured variable selection with sparsity-inducing norms. Journal of Machine Learning Research, 12:2777–2824, Oct. 2011.

  • A. Kyrillidis and V. Cevher. Combinatorial selection and least absolute shrinkage via the CLASH algorithm. arXiv:1203.2936 [cs.IT], Mar. 2012a.

  • A. Kyrillidis and V. Cevher. Sublinear time, approximate model-based sparse recovery for all. arXiv:1203.4746 [cs.IT], Mar. 2012b.

  • A. Lozano, G. Swirszcz, and N. Abe. Group orthogonal matching pursuit for logistic regression. In G. Gordon, D. Dunson, and M. Dudik, editors, Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, volume 15, pages 452–460, Ft. Lauderdale, FL, USA, 2011. JMLR W&CP.

  • Y. C. Pati, R. Rezaiifar, and P. S. Krishnaprasad. Orthogonal matching pursuit: Recursive function approximation with applications to wavelet decomposition. In Conference Record of the 27th Asilomar Conference on Signals, Systems and Computers, volume 1, pages 40–44, Pacific Grove, CA, Nov. 1993.

  • V. Roth and B. Fischer. The Group-Lasso for generalized linear models: Uniqueness of solutions and efficient algorithms. In Proceedings of the 25th International Conference on Machine Learning, ICML ’08, pages 848–855, New York, NY, USA, 2008.

  • A. Tewari, P. K. Ravikumar, and I. S. Dhillon. Greedy algorithms for structurally constrained high dimensional problems. In J. Shawe-Taylor, R. Zemel, P. Bartlett, F. Pereira, and K. Weinberger, editors, Advances in Neural Information Processing Systems, volume 24, pages 882–890. 2011.

  • J. A. Tropp. User-friendly tail bounds for sums of random matrices. Foundations of Computational Mathematics, 12(4):389–434, Aug. 2012.

Copyright information

© 2014 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Bahmani, S. (2014). Estimation Under Model-Based Sparsity. In: Algorithms for Sparsity-Constrained Optimization. Springer Theses, vol 261. Springer, Cham. https://doi.org/10.1007/978-3-319-01881-2_5

  • DOI: https://doi.org/10.1007/978-3-319-01881-2_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-01880-5

  • Online ISBN: 978-3-319-01881-2
