Standard Algorithms

  • Bogdan Dumitrescu
  • Paul Irofti

Abstract

There are many dictionary learning algorithms, and this chapter is devoted to the most important ones, which solve the basic problem of minimizing the representation error. The most successful approaches rely on alternate optimization, iteratively solving sparse coding and dictionary update problems. OMP is often the choice for sparse coding, and we present a batch version that is efficient in the DL context, where many signals have to be represented at the same time. The dictionary update step is where algorithms differ most. We start with the oldest and simplest method, Sparsenet, which uses gradient descent, and then describe the block coordinate descent idea. The best-known DL algorithms, MOD and K-SVD, come next. Because of the computational complexity of DL, we also explore parallel versions of the coordinate descent algorithms, in which several atoms are updated simultaneously rather than sequentially, as usual. Other algorithms, like SimCO and NSGK, rely on different viewpoints or subtle modifications of the classic approaches. Having presented all these algorithms, we discuss implementation issues that may have a great effect on the result. Then, we compare the algorithms, examining especially their numerical behavior on test problems. The outcome is that the competition between the algorithms is quite tight and that many of them are good candidates for practical use.
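The alternate-optimization scheme described above can be illustrated with a minimal sketch: sparse coding by a plain (non-batch) OMP, followed by a MOD-style dictionary update D = YXᵀ(XXᵀ)⁻¹. This is an assumption-laden toy, not the chapter's exact algorithms; function names, sizes, and the stopping rule (a fixed iteration count) are illustrative choices.

```python
import numpy as np

def omp(D, y, s):
    """Plain OMP: greedily select s atoms of D to approximate y (illustrative, not the batch version)."""
    m, n = D.shape
    x = np.zeros(n)
    support = []
    residual = y.copy()
    for _ in range(s):
        # pick the atom most correlated with the current residual
        k = int(np.argmax(np.abs(D.T @ residual)))
        support.append(k)
        # least-squares fit of y on the currently selected atoms
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

def dl_mod(Y, n_atoms, s, iters=15, seed=0):
    """Alternate optimization: OMP sparse coding + MOD dictionary update (sketch)."""
    rng = np.random.default_rng(seed)
    m, N = Y.shape
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
    for _ in range(iters):
        # sparse coding: one OMP per signal (a batch OMP would share computations here)
        X = np.column_stack([omp(D, Y[:, j], s) for j in range(N)])
        # MOD update D = Y X^T (X X^T)^{-1}, solved via least squares for stability
        D = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T
        D /= np.linalg.norm(D, axis=0) + 1e-12        # renormalize atoms
    # final coding pass so X matches the returned dictionary
    X = np.column_stack([omp(D, Y[:, j], s) for j in range(N)])
    return D, X

# synthetic test: signals exactly 3-sparse in a hidden random dictionary
rng = np.random.default_rng(1)
Dtrue = rng.standard_normal((16, 32))
Dtrue /= np.linalg.norm(Dtrue, axis=0)
X0 = np.zeros((32, 200))
for j in range(200):
    idx = rng.choice(32, size=3, replace=False)
    X0[idx, j] = rng.standard_normal(3)
Y = Dtrue @ X0
D, X = dl_mod(Y, n_atoms=32, s=3)
err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
print(f"relative representation error: {err:.3f}")
```

Replacing the MOD step with per-atom updates (fixing all other atoms and optimizing one) would turn this loop into the coordinate-descent / K-SVD family the chapter covers next.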

References

  1. M. Aharon, M. Elad, A. Bruckstein, K-SVD: an algorithm for designing overcomplete dictionaries for sparse representation. IEEE Trans. Signal Process. 54(11), 4311–4322 (2006)
  2. S. Chatterjee, M. Vehkapera, M. Skoglund, Projection-based and look-ahead strategies for atom selection. IEEE Trans. Signal Process. 60(2), 634–647 (2012)
  3. K. Dabov, A. Foi, V. Katkovnik, K. Egiazarian, Image denoising by sparse 3-D transform-domain collaborative filtering. IEEE Trans. Image Process. 16(8), 2080–2095 (2007)
  4. W. Dai, T. Xu, W. Wang, Simultaneous codeword optimization (SimCO) for dictionary update and learning. IEEE Trans. Signal Process. 60(12), 6340–6353 (2012)
  5. K. Engan, S.O. Aase, J.H. Husoy, Method of optimal directions for frame design, in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 5 (1999), pp. 2443–2446
  6. G.H. Golub, C. Van Loan, Matrix Computations, 4th edn. (Johns Hopkins University Press, Baltimore, 2013)
  7. P. Irofti, B. Dumitrescu, GPU parallel implementation of the approximate K-SVD algorithm using OpenCL, in 22nd European Signal Processing Conference (2014), pp. 271–275
  8. P. Irofti, B. Dumitrescu, Overcomplete dictionary design: the impact of the sparse representation algorithm, in 20th International Conference on Control Systems and Computer Science (2015), pp. 901–908
  9. P. Irofti, B. Dumitrescu, Overcomplete dictionary learning with Jacobi atom updates, in 39th International Conference on Telecommunications and Signal Processing (2016), pp. 421–424
  10. K. Kreutz-Delgado, J.F. Murray, B.D. Rao, K. Engan, T.W. Lee, T.J. Sejnowski, Dictionary learning algorithms for sparse representation. Neural Comput. 15(2), 349–396 (2003)
  11. B. Mailhé, M. Plumbley, Fixed points of dictionary learning algorithms for sparse representations. IEEE Trans. Inf. Theory (2013, submitted). https://hal.inria.fr/file/index/docid/807545/filename/locOpt.pdf
  12. B.A. Olshausen, D.J. Field, Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381, 607–609 (1996)
  13. B.A. Olshausen, D.J. Field, Sparse coding with an overcomplete basis set: a strategy employed by V1? Vision Res. 37(23), 3311–3325 (1997)
  14. L. Rebollo-Neira, Dictionary redundancy elimination. IEE Proc. Vis. Image Signal Process. 151(1), 31–34 (2004)
  15. R. Rubinstein, M. Zibulevsky, M. Elad, Efficient implementation of the K-SVD algorithm using batch orthogonal matching pursuit. Technical Report CS-2008-08, Technion, Haifa (2008)
  16. M. Sadeghi, M. Babaie-Zadeh, C. Jutten, Dictionary learning for sparse representation: a novel approach. IEEE Signal Process. Lett. 20(12), 1195–1198 (2013)
  17. M. Sadeghi, M. Babaie-Zadeh, C. Jutten, Learning overcomplete dictionaries based on atom-by-atom updating. IEEE Trans. Signal Process. 62(4), 883–891 (2014)
  18. S.K. Sahoo, A. Makur, Dictionary training for sparse representation as generalization of K-means clustering. IEEE Signal Process. Lett. 20(6), 587–590 (2013)
  19. L.N. Smith, M. Elad, Improving dictionary learning: multiple dictionary updates and coefficient reuse. IEEE Signal Process. Lett. 20(1), 79–82 (2013)
  20. B.L. Sturm, M.G. Christensen, Comparison of orthogonal matching pursuit implementations, in Proceedings of the European Signal Processing Conference (EUSIPCO), Bucharest (2012), pp. 220–224
  21. A.G. Weber, The USC-SIPI image database (1997)

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Bogdan Dumitrescu, Department of Automatic Control and Systems Engineering, Faculty of Automatic Control and Computers, University Politehnica of Bucharest, Bucharest, Romania
  2. Paul Irofti, Department of Computer Science, Faculty of Mathematics and Computer Science, University of Bucharest, Bucharest, Romania
