Abstract
This chapter introduces the basics of Bayesian probability theory, information theory, artificial neural networks, and statistical signal processing. The material presented here was selected from several textbooks. The goal of the chapter is to cover the basic notions and terminology used throughout the thesis; it is not intended to give a broad treatment of these theories, but rather to recall some definitions and their relations to one another.
Everything should be made as simple as possible, but not simpler.
Albert Einstein
Notes
The book is available online via MacKay's homepage: http://wol.ra.phy.cam.ac.uk/mackay/itprnn/#book
Note that throughout the book, H(·) is referred to as differential entropy; h(x) was used in this section merely to distinguish the entropy of a discrete random variable (RV) from that of a continuous RV (see the definitions restated after these notes).
The three terms can be used interchangeably in this context.
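For reference, the two quantities that this notational distinction separates are defined as follows. These are the standard textbook definitions (e.g., Cover and Thomas), restated here for convenience rather than reproduced from the chapter:

H(X) = -\sum_{x} p(x) \log p(x)    (entropy of a discrete RV X with probability mass function p)

h(X) = -\int f(x) \log f(x) \, dx    (differential entropy of a continuous RV X with density f)

Unlike its discrete counterpart, the differential entropy h(X) can be negative and changes under an invertible transformation of X (for instance, h(aX) = h(X) + log|a|), which is why keeping the two notations apart is useful.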
Copyright information
© 1998 Springer Science+Business Media Dordrecht
About this chapter
Cite this chapter
Lee, TW. (1998). Basics. In: Independent Component Analysis. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-2851-4_1
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4419-5056-7
Online ISBN: 978-1-4757-2851-4
eBook Packages: Springer Book Archive