
Entropy for the Complexity of Physiological Signal Dynamics

  • Xiaohua Douglas Zhang
Chapter
Part of the Advances in Experimental Medicine and Biology book series (AEMB, volume 1028)

Abstract

The rapid development of large-scale data storage, mobile network technology, and portable medical devices has recently made it possible to measure, record, store, and analyze biological dynamics. Portable noninvasive medical devices are crucial for capturing the individual characteristics of these dynamics. Wearable noninvasive devices, together with the analysis and management of the resulting digital medical data, will revolutionize the management and treatment of diseases, ultimately leading to a new healthcare system. One key feature that can be extracted from the data produced by wearable noninvasive devices is the complexity of physiological signals, which can be quantified by the entropy of the biological dynamics contained in the signals recorded by these continuous-monitoring devices. In this chapter I therefore present the major entropy concepts commonly used to measure the complexity of biological dynamics: Shannon entropy, Kolmogorov entropy, Rényi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate the use of entropy to characterize the complexity of glucose dynamics.
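To make the sample-entropy concept named above concrete, the following is a minimal sketch of its computation following Richman and Moorman's definition: count pairs of length-m templates whose Chebyshev distance is below a tolerance r, repeat for length m + 1, and take the negative logarithm of the ratio, excluding self-matches. The function name `sample_entropy` and the default tolerance of 20% of the signal's standard deviation are illustrative conventions, not the chapter's own implementation.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal.

    B counts pairs of length-m templates closer than r (Chebyshev
    distance); A does the same for length m + 1. SampEn = -ln(A / B),
    with self-matches excluded.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common choice: 20% of the signal's SD

    def count_matches(k):
        # Overlapping templates of length k; both lengths use the same
        # number of templates (len(x) - m) so the counts are comparable.
        templates = np.array([x[i:i + k] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates only,
            # so self-matches are never counted.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d < r)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B)
```

A perfectly periodic signal is maximally predictable, so its sample entropy is near zero, while uncorrelated noise yields a clearly positive value; the multiscale variant discussed in the chapter applies this same computation to coarse-grained versions of the signal.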

Keywords

High-throughput phenotyping · Entropy · Complexity · Wearable medical device · Continuous monitoring


Copyright information

© Springer Nature Singapore Pte Ltd. 2017

Authors and Affiliations

  1. Faculty of Health Sciences, University of Macau, Taipa, China
