
Recovering Signals with Unknown Sparsity in Multiple Dictionaries

  • Chapter
Compressed Sensing and its Applications

Part of the book series: Applied and Numerical Harmonic Analysis (ANHA)

Abstract

Motivated by the observation that a given signal $x$ may admit sparse representations in multiple dictionaries $\Psi_d$, but with varying levels of sparsity across dictionaries, we propose two new algorithms for signal reconstruction from noisy linear measurements. Our first algorithm extends the well-known basis pursuit denoising algorithm from the $\ell_1$ regularizer $\|\Psi x\|_1$ to composite regularizers of the form $\sum_d \lambda_d \|\Psi_d x\|_1$ while self-adjusting the regularization weights $\lambda_d$. Our second algorithm extends the well-known iteratively reweighted $\ell_1$ algorithm to the same family of composite regularizers. For each algorithm, we provide several interpretations: (i) majorization-minimization (MM) applied to a non-convex log-sum-type penalty, (ii) MM applied to an approximate $\ell_0$-type penalty, (iii) MM applied to Bayesian MAP inference under a particular hierarchical prior, and (iv) variational expectation-maximization (VEM) under a particular prior with deterministic unknown parameters. A detailed numerical study suggests that, when compared to their non-composite counterparts, our composite algorithms yield significant improvements in accuracy with only modest increases in computational complexity.
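As a rough illustration of how the first algorithm's weights can self-adjust, the sketch below implements one natural instantiation of interpretation (i): alternate between solving the weighted composite problem $\min_x \tfrac{1}{2}\|y - Ax\|_2^2 + \sum_d \lambda_d \|\Psi_d x\|_1$ and applying the MM update $\lambda_d = N_d / (\varepsilon + \|\Psi_d x\|_1)$ that arises from majorizing a log-sum penalty, where $N_d$ is the number of rows of $\Psi_d$. This is a hypothetical sketch, not the authors' implementation: the function names are invented, the exact weight update in the chapter may differ, and the inner solve uses L-BFGS on a smoothed $\ell_1$ in place of a proper splitting solver such as those in [1, 10].

```python
# A minimal sketch (not the authors' code) of the composite-L1 idea from the
# abstract: basis pursuit denoising with one self-adjusted weight lambda_d per
# dictionary. Names such as co_l1, n_outer, eps, and mu are illustrative.
import numpy as np
from scipy.optimize import minimize

def co_l1(y, A, Psis, n_outer=8, eps=1e-2, mu=1e-6):
    """Alternate (i) solving min_x 0.5||y - A x||^2 + sum_d lam_d ||Psi_d x||_1
    (here via L-BFGS on a smoothed |.|, standing in for a splitting solver)
    and (ii) the log-sum MM weight update lam_d = N_d / (eps + ||Psi_d x||_1)."""
    x = np.zeros(A.shape[1])
    lam = np.ones(len(Psis))  # start with equal per-dictionary weights
    for _ in range(n_outer):
        def obj(x):
            val = 0.5 * np.sum((A @ x - y) ** 2)
            for l, P in zip(lam, Psis):
                val += l * np.sum(np.sqrt((P @ x) ** 2 + mu))  # smoothed l1
            return val
        x = minimize(obj, x, method="L-BFGS-B").x
        # Dictionaries in which the current estimate is sparser (small
        # ||Psi_d x||_1) receive larger weights, per interpretation (i).
        lam = np.array([P.shape[0] / (eps + np.abs(P @ x).sum()) for P in Psis])
        # (The second algorithm instead reweights every coefficient,
        #  w_{d,i} = 1 / (eps + |[Psi_d x]_i|), as in iteratively reweighted l1.)
    return x, lam

# Toy usage: x_true is sparse in Psi_1 (identity) but not in a random Psi_2,
# so the learned weights should favor the identity dictionary.
rng = np.random.default_rng(0)
n, m = 64, 32
x_true = np.zeros(n)
x_true[rng.choice(n, 4, replace=False)] = rng.normal(size=4)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.normal(size=m)
Psis = [np.eye(n), rng.normal(size=(n, n)) / np.sqrt(n)]
x_hat, lam = co_l1(y, A, Psis)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
print("learned weights:", lam)
```

On this toy problem, the learned weights end up favoring the dictionary in which the signal is genuinely sparse, which is the qualitative behavior the abstract describes for the self-adjusting composite regularizer.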

Notes

  1. Although (5) is over-parameterized, the form of (5) is convenient for algorithm development.

  2. Matlab codes for [14] are provided at https://github.com/basp-group/sopt.

References

  1. M.V. Afonso, J.M. Bioucas-Dias, M.A.T. Figueiredo, Fast image recovery using variable splitting and constrained optimization. IEEE Trans. Image Process. 19(9), 2345–2356 (2010)

  2. R. Ahmad, P. Schniter, Iteratively reweighted ℓ1 approaches to sparse composite regularization. IEEE Trans. Comput. Imaging 1(4), 220–235 (2015)

  3. R. Ahmad, H. Xue, S. Giri, Y. Ding, J. Craft, O.P. Simonetti, Variable density incoherent spatiotemporal acquisition (VISTA) for highly accelerated cardiac MRI. Magn. Reson. Med. 74(5), 1266–1278 (2015). https://doi.org/10.1002/mrm.25507

  4. S.D. Babacan, S. Nakajima, M.N. Do, Bayesian group-sparse modeling and variational inference. IEEE Trans. Signal Process. 62(11), 2906–2921 (2014)

  5. S. Becker, J. Bobin, E.J. Candès, NESTA: a fast and accurate first-order method for sparse recovery. SIAM J. Imaging Sci. 4(1), 1–39 (2011)

  6. M. Belge, M.E. Kilmer, E.L. Miller, Efficient determination of multiple regularization parameters in a generalized L-curve framework. Inverse Prob. 18(4), 1161–1183 (2002)

  7. J.O. Berger, Statistical Decision Theory and Bayesian Analysis (Springer, New York, 1985)

  8. C.M. Bishop, Pattern Recognition and Machine Learning (Springer, New York, 2007)

  9. J.M. Borwein, A.S. Lewis, Convex Analysis and Nonlinear Optimization (Springer, New York, 2006)

  10. S. Boyd, N. Parikh, E. Chu, B. Peleato, J. Eckstein, Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 3(1), 1–122 (2010)

  11. C. Brezinski, M. Redivo-Zaglia, G. Rodriguez, S. Seatzu, Multi-parameter regularization techniques for ill-conditioned linear systems. Numer. Math. 94(2), 203–228 (2003)

  12. E.J. Candès, M.B. Wakin, An introduction to compressive sampling. IEEE Signal Process. Mag. 25(2), 21–30 (2008)

  13. E.J. Candès, M.B. Wakin, S. Boyd, Enhancing sparsity by reweighted ℓ1 minimization. J. Fourier Anal. Appl. 14(5), 877–905 (2008)

  14. R.E. Carrillo, J.D. McEwen, D. Van De Ville, J.P. Thiran, Y. Wiaux, Sparsity averaging for compressive imaging. IEEE Signal Process. Lett. 20(6), 591–594 (2013)

  15. V. Cevher, Learning with compressible priors, in Proceedings of Neural Information Processing Systems Conference, Vancouver, BC (2009), pp. 261–269

  16. R. Chartrand, W. Yin, Iteratively reweighted algorithms for compressive sensing, in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Las Vegas, NV (2008), pp. 3869–3872

  17. S.S. Chen, D.L. Donoho, M.A. Saunders, Atomic decomposition by basis pursuit. SIAM J. Sci. Comput. 20(1), 33–61 (1998)

  18. P.L. Combettes, J.C. Pesquet, A Douglas-Rachford splitting approach to nonsmooth convex variational signal recovery. IEEE J. Sel. Top. Sign. Process. 1(4), 564–574 (2007)

  19. I. Daubechies, R. DeVore, M. Fornasier, C.S. Güntürk, Iteratively reweighted least squares minimization for sparse recovery. Commun. Pure Appl. Math. 63(1), 1–38 (2010)

  20. A. Dempster, N.M. Laird, D.B. Rubin, Maximum-likelihood from incomplete data via the EM algorithm. J. R. Stat. Soc. 39, 1–17 (1977)

  21. M. Elad, P. Milanfar, R. Rubinstein, Analysis versus synthesis in signal priors. Inverse Prob. 23, 947–968 (2007)

  22. Y.C. Eldar, G. Kutyniok, Compressed Sensing: Theory and Applications (Cambridge University Press, New York, 2012)

  23. M.A. Figueiredo, Adaptive sparseness for supervised learning. IEEE Trans. Pattern Anal. Mach. Intell. 25(9), 1150–1159 (2003)

  24. M.A.T. Figueiredo, R.D. Nowak, Wavelet-based image estimation: an empirical Bayes approach using Jeffreys’ noninformative prior. IEEE Trans. Image Process. 10(9), 1322–1331 (2001)

  25. M.A.T. Figueiredo, R.D. Nowak, Majorization-minimization algorithms for wavelet-based image restoration. IEEE Trans. Image Process. 16(12), 2980–2991 (2007)

  26. M. Fornasier, V. Naumova, S.V. Pereverzyev, Parameter choice strategies for multipenalty regularization. SIAM J. Numer. Anal. 52(4), 1770–1794 (2014)

  27. S. Foucart, H. Rauhut, A Mathematical Introduction to Compressive Sensing (Birkhäuser, New York, 2013)

  28. S. Gazzola, P. Novati, Multi-parameter Arnoldi-Tikhonov methods. Electron. Trans. Numer. Anal. 40, 452–475 (2013)

  29. D.R. Hunter, K. Lange, A tutorial on MM algorithms. Am. Stat. 58(1), 30–37 (2004)

  30. M.A. Khajehnejad, M. Amin, W. Xu, A.S. Avestimehr, B. Hassibi, Improved sparse recovery thresholds with two-step reweighted ℓ1 minimization, in Proceedings of the IEEE International Symposium on Information Theory (2010), pp. 1603–1607

  31. M. Kowalski, Sparse regression using mixed norms. Appl. Comput. Harmon. Anal. 27(2), 303–324 (2009)

  32. K. Kunisch, T. Pock, A bilevel optimization approach for parameter learning in variational models. SIAM J. Imag. Sci. 6(2), 938–983 (2013)

  33. S. Lu, S.V. Pereverzev, Regularization Theory for Ill-Posed Problems (Walter de Gruyter, Berlin, 2013)

  34. J. Mairal, Optimization with first-order surrogate functions, in Proceeding International Conference on Machine Learning, vol. 28 (2013), pp. 783–791

  35. J. Mairal, F. Bach, J. Ponce, Sparse modeling for image and vision processing. Found. Trends Comput. Vis. 8(2–3), 85–283 (2014)

  36. R. Neal, G. Hinton, A view of the EM algorithm that justifies incremental, sparse, and other variants, in Learning in Graphical Models, ed. by M.I. Jordan (MIT Press, Cambridge, MA, 1998), pp. 355–368

  37. J.P. Oliveira, J.M. Bioucas-Dias, M.A.T. Figueiredo, Adaptive total variation image deblurring: a majorization-minimization approach. Signal Process. 89(9), 1683–1693 (2009)

  38. H.V. Poor, An Introduction to Signal Detection and Estimation, 2nd edn. (Springer, New York, 1994)

  39. G. Puy, P. Vandergheynst, R. Gribonval, Y. Wiaux, Universal and efficient compressed sensing by spread spectrum and application to realistic Fourier imaging techniques. EURASIP J. Appl. Signal Process. 2012(6), 1–13 (2012)

  40. A. Rakotomamonjy, Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms. Signal Process. 91, 1505–1526 (2011)

  41. B.D. Rao, K. Kreutz-Delgado, An affine scaling methodology for best basis selection. IEEE Trans. Signal Process. 47, 187–200 (1999)

  42. L.I. Rudin, S. Osher, E. Fatemi, Nonlinear total variation based noise removal algorithms. Physica D 60, 259–268 (1992)

  43. Z. Tan, Y. Eldar, A. Beck, A. Nehorai, Smoothing and decomposition for analysis sparse recovery. IEEE Trans. Signal Process. 62(7), 1762–1774 (2014)

  44. R. Tibshirani, Regression shrinkage and selection via the lasso. J. R. Stat. Soc. B 58(1), 267–288 (1996)

  45. R.J. Tibshirani, Solution path of the generalized lasso. Ann. Stat. 39(3), 1335–1371 (2011)

  46. D. Wipf, S. Nagarajan, Iterative reweighted ℓ1 and ℓ2 methods for finding sparse solutions. IEEE J. Sel. Top. Sign. Process. 4(2), 317–329 (2010)

  47. P. Xu, Y. Fukuda, Y. Liu, Multiple parameter regularization: numerical solutions and applications to the determination of geopotential from precise satellite orbits. J. Geodesy 80(1), 17–27 (2006)

Acknowledgements

This work was supported in part by NSF grants CCF-1218754 and CCF-1018368, DARPA grant N66001-11-1-4090, and NIH grant R01HL135489. An early version of this work was presented at the 2015 ISMRM Annual Meeting and Exhibition.

Author information

Correspondence to Philip Schniter.


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Ahmad, R., Schniter, P. (2017). Recovering Signals with Unknown Sparsity in Multiple Dictionaries. In: Boche, H., Caire, G., Calderbank, R., März, M., Kutyniok, G., Mathar, R. (eds) Compressed Sensing and its Applications. Applied and Numerical Harmonic Analysis. Birkhäuser, Cham. https://doi.org/10.1007/978-3-319-69802-1_5
