Abstract
In this chapter, we explain the concepts of information, discrete entropy, and mutual information in detail. To master the subjects of information theory, the reader should have a working knowledge of probability and random variables; we therefore suggest reviewing those topics before studying information theory. Continuous entropy and continuous mutual information are closely related to their discrete counterparts, so the reader should first understand the fundamental concepts explained in this chapter very well, then proceed with the other chapters of the book.
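As a minimal illustration of the quantities named above (a sketch of the standard definitions, not code from the chapter; the function names and layout here are our own), discrete entropy and mutual information can be computed directly from a probability mass function:

```python
import math

def entropy(pmf):
    """Discrete (Shannon) entropy in bits: H(X) = -sum_x p(x) * log2 p(x)."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint pmf as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    hxy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)
    return entropy(px) + entropy(py) - hxy

# A fair coin carries 1 bit of information per toss.
print(entropy([0.5, 0.5]))                               # 1.0
# Independent X and Y share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

For fully dependent variables, e.g. the joint pmf [[0.5, 0.0], [0.0, 0.5]], the mutual information equals the full 1 bit of entropy of either variable.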
Copyright information
© 2018 Springer Nature Singapore Pte Ltd.
Cite this chapter
Gazi, O. (2018). Concept of Information, Discrete Entropy and Mutual Information. In: Information Theory for Electrical Engineers. Signals and Communication Technology. Springer, Singapore. https://doi.org/10.1007/978-981-10-8432-4_1
DOI: https://doi.org/10.1007/978-981-10-8432-4_1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-8431-7
Online ISBN: 978-981-10-8432-4
eBook Packages: Engineering (R0)