Abstract
There has been much interest recently in the use of neural networks to solve complicated information processing problems such as those which arise in signal and image processing. In this paper we review Markov random field (MRF) neural network techniques for representing joint probability density functions (PDFs). The “Boltzmann machine” serves as the paradigm, and we present a generalised version of its learning algorithm. We also present a technique for designing MRF potentials with low information redundancy for modelling image texture. To improve further the computational efficiency of such neural networks we introduce a novel method of cluster decomposing a PDF by using topographic mappings. The outcome of this programme is a means of designing sampling functions for extracting information from datasets (typically images).
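As an illustration of the kind of MRF/Gibbs-distribution sampling the abstract refers to, the sketch below draws samples from a simple Ising-type Gibbs distribution using the Metropolis algorithm. This is not code from the chapter; the function `metropolis_ising` and its parameters (`size`, `beta`, `sweeps`) are illustrative assumptions for a minimal nearest-neighbour MRF on a toroidal lattice.

```python
import math
import random

def metropolis_ising(size=8, beta=0.5, sweeps=200, seed=0):
    """Sample a binary MRF (Ising-type Gibbs distribution) on a
    size x size toroidal lattice via the Metropolis algorithm."""
    rng = random.Random(seed)
    # Random initial configuration; spins take values in {-1, +1}.
    s = [[rng.choice([-1, 1]) for _ in range(size)] for _ in range(size)]
    for _ in range(sweeps):
        for i in range(size):
            for j in range(size):
                # Sum over the four nearest neighbours (periodic boundary).
                nb = (s[(i - 1) % size][j] + s[(i + 1) % size][j]
                      + s[i][(j - 1) % size] + s[i][(j + 1) % size])
                # Energy change if spin (i, j) were flipped.
                dE = 2.0 * beta * s[i][j] * nb
                # Metropolis acceptance rule: always accept downhill moves,
                # accept uphill moves with probability exp(-dE).
                if dE <= 0 or rng.random() < math.exp(-dE):
                    s[i][j] *= -1
    return s
```

Run long enough, the Markov chain converges to the Gibbs distribution defined by the nearest-neighbour potentials; this is the same stochastic-relaxation idea used by Geman and Geman (1984) and underlying the Boltzmann machine's sampling step.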
Keywords
- Markov Random Field
- Joint Probability Density Function
- Sampling Function
- Probability Density Function
- Gibbs Distribution
References
Ackley D H, Hinton G E and Sejnowski T J, 1985, Cogn. Sci., 9, 147–169, ‘A learning algorithm for Boltzmann machines’.
Besag J, 1974, J. R. Statist. Soc., Ser. B, 36, 192–236, ‘Spatial interaction and the statistical analysis of lattice systems’.
DeGroot M H, 1970, Optimal Statistical Decisions, New York, McGraw-Hill.
Geman S and Geman D, 1984, IEEE PAMI, 6(6), 721–741, ‘Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images’.
Geman S and Graffigne C, Proc. Int. Cong. Math. 1986, Ed. Gleason A M, Am. Math. Soc., Providence, 1987, ‘Markov random field image models and their applications to computer vision’.
Jaynes E T, 1957, Phys. Rev., 106(4), 620–630, ‘Information theory and statistical mechanics’.
Jaynes E T, 1957, Phys. Rev., 108(2), 171–190, ‘Information theory and statistical mechanics. II’.
Jaynes E T, 1968, IEEE Trans. SSC, 4(3), 227–241, ‘Prior probabilities’.
Jaynes E T, 1982, Proc. IEEE, 70(9), 939–952, ‘On the rationale of maximum-entropy methods’.
Kindermann R and Snell J L, 1980, Markov Random Fields and their Applications, Contemporary Mathematics, Vol. 1, Am. Math. Soc., Providence, Rhode Island.
Kohonen T, 1984, Self-Organization and Associative Memory, Springer-Verlag.
Luttrell S P, 1985, RSRE Memo., 3815, ‘The implications of Boltzmann-type machines for SAR data processing’.
Luttrell S P, 1987a, Inv. Prob., 3, 289–300, ‘The use of Markov random field models to derive sampling schemes for inverse texture problems’.
Luttrell S P, 1987b, Proc. AGARD Conf. on Scattering and Propagation in Random Media, ‘Markov random fields: a strategy for clutter modelling’.
Luttrell S P, 1987c, Proc. SPIE Int. Symp. on Inverse Problems in Optics, 808, 182–188, Ed. Pike E R, ‘The use of Markov Random field models in sampling scheme design’.
Luttrell S P, 1987d, Proc. IEE RADAR-87 Conf., 222–226, ‘Designing Markov random field structures for clutter modelling’.
Luttrell S P, 1988a, to be published in Inv. Prob., ‘A maximum entropy approach to sampling function design’.
Luttrell S P, 1988b, Proc. IGARSS’88 Conf. on Remote Sensing Moving Towards the 21st Century, ‘Image compression using a neural network’.
Luttrell S P, 1988c, submitted to Patt. Recog. Letts., ‘Image compression using a multilayer neural network’.
Metropolis N, Rosenbluth A W, Rosenbluth M N and Teller A H, 1953, J. Chem. Phys., 21, 1087–1092, ‘Equation of state calculations by fast computing machines’.
Preston C J, 1974, Gibbs States on Countable Sets, Cambridge University Press.
Sejnowski T J, 1986, Proc. Conf. on Neural networks for Computing, Vol. 151, Am. Inst. Phys., Snowbird, Utah, ‘Higher order Boltzmann machines’.
Copyright information
© 1989 Springer Science+Business Media Dordrecht
Cite this chapter
Luttrell, S.P. (1989). The Use of Bayesian and Entropic Methods in Neural Network Theory. In: Skilling, J. (eds) Maximum Entropy and Bayesian Methods. Fundamental Theories of Physics, vol 36. Springer, Dordrecht. https://doi.org/10.1007/978-94-015-7860-8_37
Publisher Name: Springer, Dordrecht
Print ISBN: 978-90-481-4044-2
Online ISBN: 978-94-015-7860-8