Circuits, Systems, and Signal Processing, Volume 38, Issue 2, pp 569–589

Neural Networks for Compressed Sensing Based on Information Geometry

  • Meng Wang
  • Chuang-Bai Xiao
  • Zhen-Hu Ning (corresponding author)
  • Tong Li
  • Bei Gong
Article

Abstract

Neural networks that embed prior knowledge of the distribution of the original signal have attracted increasing attention in compressed sensing. However, maximizing the probability of the desired output of a neural network does not guarantee that the statistical distribution of the recovered signal is consistent with that of the original signal. In this paper, we combine neural networks with information geometry to study the recovery of sparse signals that follow a given distribution. We construct the geodesic distance between the distribution of the original signal and that of the recovered signal and use it as an input to the neural network. Experiments show that the proposed method achieves better reconstruction quality than existing algorithms.
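
As an illustration of the kind of quantity the abstract describes, the sketch below computes the closed-form Fisher–Rao geodesic distance between two univariate Gaussians fitted to an original and a recovered signal. This is a minimal sketch only, not the authors' implementation: the Gaussian model, the function name fisher_rao_gaussian, and the stand-in signals are assumptions introduced here for illustration.

    import numpy as np

    def fisher_rao_gaussian(mu1, sigma1, mu2, sigma2):
        # Closed-form Fisher-Rao geodesic distance between N(mu1, sigma1^2)
        # and N(mu2, sigma2^2). Under the Fisher information metric the
        # univariate Gaussian family is a hyperbolic space, which yields
        # this arccosh expression.
        num = (mu1 - mu2) ** 2 + 2.0 * (sigma1 - sigma2) ** 2
        return np.sqrt(2.0) * np.arccosh(1.0 + num / (4.0 * sigma1 * sigma2))

    # Stand-in signals (purely illustrative): an "original" signal and a
    # noisy "recovered" version of it.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=256)
    x_hat = x + rng.normal(0.0, 0.1, size=256)

    # Fit a Gaussian to each signal and measure how far apart the fitted
    # distributions are along the geodesic; a loss or input feature built
    # from this quantity penalises distributional mismatch.
    d = fisher_rao_gaussian(x.mean(), x.std(), x_hat.mean(), x_hat.std())
    print(f"Geodesic distance between fitted Gaussians: {d:.4f}")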

Keywords

Sparse recovery · Information geometry · Geodesic distance · Neural network


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Meng Wang (1)
  • Chuang-Bai Xiao (1)
  • Zhen-Hu Ning (1), corresponding author
  • Tong Li (1)
  • Bei Gong (1)

  1. Beijing University of Technology, Beijing, China
