
A Smoothing Descent Method for Nonconvex TV\(^q\)-Models

  • Conference paper
Efficient Algorithms for Global Optimization Methods in Computer Vision

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 8293)

Abstract

A novel class of variational models with nonconvex \(\ell_q\)-norm-type regularizations (\(0<q<1\)) is considered; such models typically outperform popular models with convex regularizations in restoring sparse images. Because the objective function is nonconvex and non-Lipschitz, these models are challenging from both an analytical and a numerical point of view. In this work, a smoothing descent method with provable convergence properties is proposed for computing stationary points of the underlying variational problem. Numerical experiments are reported that illustrate the effectiveness of the new method.

This research was supported by the Austrian Science Fund (FWF) through START project Y305 “Interfaces and Free Boundaries” and through SFB project F3204 “Mathematical Optimization and Applications in Biomedical Sciences”.
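
To convey the general idea behind smoothing the non-Lipschitz \(\ell_q\)-type regularizer, the following Python sketch treats a one-dimensional TV\(^q\) denoising objective: the term \(|t|^q\) is replaced by \((t^2+\varepsilon^2)^{q/2}\), gradient descent with Armijo backtracking is run on the smoothed objective, and \(\varepsilon\) is then reduced. This is a generic illustration under assumed parameter values and stopping rules, not the algorithm analyzed in the paper.

```python
import numpy as np

def tvq_smoothing_descent(f, alpha=0.05, q=0.5, eps0=1.0, eps_min=1e-4,
                          shrink=0.5, max_inner=200):
    """Illustrative smoothing descent for a 1-D TV^q denoising model.

    Minimizes J_eps(u) = 0.5*||u - f||^2 + alpha*sum_i ((Du)_i^2 + eps^2)^(q/2),
    where D is the forward-difference operator, while driving eps -> 0.
    All names, parameters, and stopping rules here are assumptions made
    for the example; this is not the paper's exact method.
    """
    f = np.asarray(f, dtype=float)
    u = f.copy()

    D = lambda v: np.diff(v)                                           # R^n -> R^(n-1)
    Dt = lambda w: np.concatenate(([-w[0]], w[:-1] - w[1:], [w[-1]]))  # adjoint of D

    def J(u, eps):                                                     # smoothed objective
        return 0.5 * np.sum((u - f) ** 2) \
            + alpha * np.sum((D(u) ** 2 + eps ** 2) ** (q / 2))

    def grad(u, eps):                                                  # gradient of J_eps
        du = D(u)
        return (u - f) + alpha * Dt(q * du * (du ** 2 + eps ** 2) ** (q / 2 - 1))

    eps = eps0
    while eps > eps_min:
        for _ in range(max_inner):                                     # descend on J_eps
            g = grad(u, eps)
            if np.linalg.norm(g) <= eps:                               # inner stopping rule
                break
            t, Ju, gg = 1.0, J(u, eps), np.dot(g, g)
            while J(u - t * g, eps) > Ju - 1e-4 * t * gg:              # Armijo backtracking
                t *= 0.5
            u = u - t * g
        eps *= shrink                                                  # tighten the smoothing
    return u

# Toy usage: denoise a noisy sparse spike signal.
rng = np.random.default_rng(0)
f_clean = np.zeros(100)
f_clean[[20, 60]] = 1.0
f_noisy = f_clean + 0.05 * rng.standard_normal(100)
u_rec = tvq_smoothing_descent(f_noisy, alpha=0.05, q=0.5)
```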



Acknowledgement

The authors would like to thank Dr. Florian Knoll for contributing his data and code to our MRI experiments.

Author information


Corresponding author

Correspondence to Michael Hintermüller.



Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hintermüller, M., Wu, T. (2014). A Smoothing Descent Method for Nonconvex TV\(^q\)-Models. In: Bruhn, A., Pock, T., Tai, X.-C. (eds) Efficient Algorithms for Global Optimization Methods in Computer Vision. Lecture Notes in Computer Science, vol 8293. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-54774-4_6


  • DOI: https://doi.org/10.1007/978-3-642-54774-4_6


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-54773-7

  • Online ISBN: 978-3-642-54774-4

  • eBook Packages: Computer Science, Computer Science (R0)
