Neural Network Image Deconvolution

Chapter in the book Maximum Entropy and Bayesian Methods

Part of the book series: Fundamental Theories of Physics (FTPH, volume 62)

Abstract

We examine the problem of deconvolving blurred text. This is a task in which there is strong prior knowledge (e.g., font characteristics) that is hard to express computationally. These priors are implicit, however, in mock data for which the true image is known. When trained on such mock data, a neural network is able to learn a solution to the image deconvolution problem which takes advantage of this implicit prior knowledge. Prior knowledge of image positivity can be hard-wired into the functional architecture of the network, but we leave it to the network to learn most of the parameters of the task from the data. We do not need to tell the network about the point spread function, the intrinsic correlation function, or the noise process.
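
A minimal sketch of this training scheme is given below, assuming details the abstract does not specify: mock "text" is imitated by sparse bright strokes, the degradation is a Gaussian point spread function with additive Gaussian noise, and positivity is hard-wired through a softplus output unit. The network maps a 7x7 patch of the degraded image to an estimate of the true central pixel; the PSF and noise level appear only in the mock-data generator, never in the network itself.

    # Illustrative sketch (not the chapter's exact architecture): train a small
    # one-hidden-layer network to map a local patch of a blurred, noisy image
    # to the true value of the central pixel.  Positivity is hard-wired by a
    # softplus output; the PSF and noise are only implicit in the mock data.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)

    def mock_image(size=32, n_strokes=40):
        # Crude stand-in for a text image: sparse bright strokes on a dark field.
        img = np.zeros((size, size))
        for _ in range(n_strokes):
            r, c = rng.integers(1, size - 1, size=2)
            img[r, c - 1:c + 2] = 1.0   # short horizontal stroke
        return img

    def degrade(img, sigma_psf=1.5, sigma_noise=0.1):
        # Mock data: Gaussian PSF plus additive Gaussian noise.
        return gaussian_filter(img, sigma_psf) + sigma_noise * rng.standard_normal(img.shape)

    def patches(blurred, truth, half=3):
        # (7x7 patch of the degraded image, true central pixel) training pairs.
        X, y = [], []
        for r in range(half, blurred.shape[0] - half):
            for c in range(half, blurred.shape[1] - half):
                X.append(blurred[r - half:r + half + 1, c - half:c + half + 1].ravel())
                y.append(truth[r, c])
        return np.array(X), np.array(y)

    # Mock training set built from images whose true pixel values are known.
    pairs = [patches(degrade(t), t) for t in (mock_image() for _ in range(20))]
    X = np.vstack([p[0] for p in pairs])
    y = np.concatenate([p[1] for p in pairs])

    # One hidden layer of tanh units; the softplus output guarantees positivity.
    n_in, n_hid = X.shape[1], 16
    W1 = 0.1 * rng.standard_normal((n_in, n_hid)); b1 = np.zeros(n_hid)
    W2 = 0.1 * rng.standard_normal(n_hid);         b2 = 0.0

    for epoch in range(200):
        h = np.tanh(X @ W1 + b1)                 # hidden activations
        a = h @ W2 + b2                          # output pre-activation
        out = np.log1p(np.exp(a))                # softplus: always positive
        err = out - y                            # gradient of squared error w.r.t. out
        d_a = err / (1.0 + np.exp(-a)) / len(y)  # d softplus / d a = sigmoid(a)
        gW2, gb2 = h.T @ d_a, d_a.sum()
        d_h = np.outer(d_a, W2) * (1.0 - h ** 2)
        gW1, gb1 = X.T @ d_h, d_h.sum(axis=0)
        W1 -= 0.05 * gW1; b1 -= 0.05 * gb1
        W2 -= 0.05 * gW2; b2 -= 0.05 * gb2

An exponential output unit would enforce positivity equally well; the patch-to-pixel design means the trained network can simply be scanned across a degraded image of any size to reconstruct it.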

Neural networks have been compared with the optimal linear filter, and with the Bayesian algorithm MemSys, on a variety of problems. The networks, once trained, were faster image reconstructors than MemSys, and had similar performance.
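
For comparison, a Wiener-style optimal linear filter can be written in a few lines, but only by supplying exactly the information the network is never given: the point spread function and an assumed noise-to-signal ratio K (a constant here, standing in for the full frequency-dependent ratio of noise to signal power spectra).

    # Minimal sketch of a Wiener-style optimal linear filter.  Unlike the
    # network, it must be told the point spread function explicitly; K is an
    # assumed constant noise-to-signal ratio.
    import numpy as np

    def wiener_deconvolve(blurred, psf, K=1e-2):
        # `psf` is centred and has the same shape as `blurred`.
        H = np.fft.fft2(np.fft.ifftshift(psf))          # PSF transfer function
        D = np.fft.fft2(blurred)
        F_hat = np.conj(H) * D / (np.abs(H) ** 2 + K)   # regularised inverse
        return np.real(np.fft.ifft2(F_hat))

Being linear, such a filter can only encode second-order prior statistics, not structured prior knowledge such as font characteristics.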

Author information

Correspondence to David J. C. MacKay.

Copyright information

© 1996 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Tansley, J.E., Oldfield, M.J., MacKay, D.J.C. (1996). Neural Network Image Deconvolution. In: Heidbreder, G.R. (eds) Maximum Entropy and Bayesian Methods. Fundamental Theories of Physics, vol 62. Springer, Dordrecht. https://doi.org/10.1007/978-94-015-8729-7_25

  • DOI: https://doi.org/10.1007/978-94-015-8729-7_25

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-90-481-4407-5

  • Online ISBN: 978-94-015-8729-7

  • eBook Packages: Springer Book Archive
