MAP versus MMSE Estimation

Chapter in Sparse and Redundant Representations

Abstract

So far we have kept the description of the pursuit algorithms at a deterministic level, presenting them as intuitive optimization procedures. We mentioned in Chapter 9 that these algorithms correspond to an approximation of the Maximum A-Posteriori Probability (MAP) estimator, but this connection was not explicitly derived. In this chapter we make the claim exact by posing the quest for sparse representations as an estimation task. As we shall see, this calls for a clear and formal definition of the stochastic model assumed to generate the sparse representation vector. A benefit of such a treatment is the ability to derive the Minimum Mean-Squared-Error (MMSE) estimator as well, which in turn leads to the need to approximate it. These and more are the topics we cover in this chapter.
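To preview the MAP-versus-MMSE distinction numerically, here is a small sketch (not taken from the book) of per-coefficient denoising under a spike-and-slab prior, the kind of stochastic model the chapter formalizes; it corresponds to the unitary-dictionary case, where estimation decouples coefficient by coefficient. All parameter values (`p`, `sig_a`, `sig_n`) and the specific MAP variant (hard support decision at posterior probability 1/2) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Spike-and-slab model (hypothetical parameters, for illustration only):
# each coefficient is active with probability p; active values ~ N(0, sig_a^2)
p, sig_a, sig_n = 0.1, 1.0, 0.3
m = 100_000

# Draw sparse coefficients a and noisy observations y = a + n
active = rng.random(m) < p
a = np.where(active, rng.normal(0.0, sig_a, m), 0.0)
y = a + rng.normal(0.0, sig_n, m)

def gauss(v, var):
    """Zero-mean Gaussian density with variance var, evaluated at v."""
    return np.exp(-v**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

# Posterior probability that each coefficient is active (Bayes' rule):
# y | active   ~ N(0, sig_a^2 + sig_n^2),   y | inactive ~ N(0, sig_n^2)
num = p * gauss(y, sig_a**2 + sig_n**2)
q = num / (num + (1.0 - p) * gauss(y, sig_n**2))

# Wiener shrinkage factor applied when a coefficient is deemed active
shrink = sig_a**2 / (sig_a**2 + sig_n**2)

# MAP (one common variant): hard decision on the support, then shrink
a_map = np.where(q > 0.5, shrink * y, 0.0)

# MMSE: posterior mean -- a soft, probability-weighted average of the
# "active" and "inactive" hypotheses, never exactly sparse
a_mmse = q * shrink * y

mse_map = np.mean((a_map - a) ** 2)
mse_mmse = np.mean((a_mmse - a) ** 2)
print(f"MAP  MSE: {mse_map:.4f}")
print(f"MMSE MSE: {mse_mmse:.4f}")
```

In this toy setting the MMSE estimate, being a weighted average over candidate supports, achieves a lower mean-squared error than the hard MAP decision, which is the phenomenon studied in the Elad-Yavneh reference below.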


Further Reading

  1. F. Abramovich, T. Sapatinas and B.W. Silverman, Wavelet thresholding via a Bayesian approach, J. R. Statist. Soc. B, 60:725–749, 1998.

  2. A. Antoniadis, J. Bigot and T. Sapatinas, Wavelet estimators in nonparametric regression: a comparative simulation study, J. Stat. Software, 6(6):1–83, 2001.

  3. M. Clyde and E.I. George, Empirical Bayes estimation in wavelet nonparametric regression, in Bayesian Inference in Wavelet Based Models, P. Muller and B. Vidakovic (Eds.), Lect. Notes Statist., 141:309–322, New York, Springer-Verlag, 1998.

  4. M. Clyde and E.I. George, Flexible empirical Bayes estimation for wavelets, J. R. Statist. Soc. B, 62:681–698, 2000.

  5. M. Clyde, G. Parmigiani and B. Vidakovic, Multiple shrinkage and subset selection in wavelets, Biometrika, 85:391–401, 1998.

  6. M. Elad and I. Yavneh, A weighted average of sparse representations is better than the sparsest one alone, IEEE Transactions on Information Theory, 55(10):4701–4714, October 2009.

  7. J. Turek, I. Yavneh, M. Protter and M. Elad, On MMSE and MAP denoising under sparse representation modeling over a unitary dictionary, submitted to Applied and Computational Harmonic Analysis.

  8. E. Larsson and Y. Selen, Linear regression with a sparse parameter vector, IEEE Transactions on Signal Processing, 55:451–460, 2007.

  9. S. Mallat and Z. Zhang, Matching pursuits with time-frequency dictionaries, IEEE Transactions on Signal Processing, 41(12):3397–3415, 1993.

  10. P. Moulin and J. Liu, Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors, IEEE Transactions on Information Theory, 45(3):909–919, April 1999.

  11. M. Protter, I. Yavneh and M. Elad, Closed-form MMSE for denoising signals under sparse-representation modeling, The IEEE 25th Convention of Electrical and Electronics Engineers in Israel, Eilat, Israel, December 3–5, 2008.

  12. M. Protter, I. Yavneh and M. Elad, Closed-form MMSE estimation for signal denoising under sparse representation modeling over a unitary dictionary, submitted to IEEE Transactions on Signal Processing.

  13. E.P. Simoncelli and E.H. Adelson, Noise removal via Bayesian wavelet coring, in Proc. ICIP, Lausanne, Switzerland, pp. 379–382, September 1996.

  14. P. Schniter, L.C. Potter and J. Ziniel, Fast Bayesian matching pursuit, Proc. Workshop on Information Theory and Applications (ITA), La Jolla, CA, January 2008.

  15. P. Schniter, L.C. Potter and J. Ziniel, Fast Bayesian matching pursuit: Model uncertainty and parameter estimation for sparse linear models, submitted to IEEE Transactions on Signal Processing.

Author information

Correspondence to Michael Elad.

Copyright information

© 2010 Springer Science+Business Media, LLC

Cite this chapter

Elad, M. (2010). MAP versus MMSE Estimation. In: Sparse and Redundant Representations. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-7011-4_11
