Principles of Uncertainty

  • George J. Klir
  • Mark J. Wierman
Part of the Studies in Fuzziness and Soft Computing book series (STUDFUZZ, volume 15)


Although measures of the various types of uncertainty-based information (Table 3.5) are not sufficient to capture the full scope of information in human communication [Cherry, 1957], they are highly effective tools for dealing with systems problems of virtually any kind [Klir, 1985]. For the classical information measures (the Hartley function and Shannon entropy), which were originally conceived solely as tools for analyzing and designing telecommunication systems, this broad utility is best demonstrated by the work of Ashby [1958, 1965, 1969, 1972] and Conant [1969, 1974, 1976, 1981].
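For reference, the two classical measures named above have simple closed forms: the Hartley function assigns log2 |A| bits to a finite, nonempty set A of possible alternatives, and the Shannon entropy assigns -Σ p_i log2 p_i bits to a probability distribution. The sketch below is a minimal illustration of these standard definitions only; the function names and example distributions are ours and are not taken from the chapter.

```python
import math


def hartley_measure(alternatives):
    """Hartley measure: log2 of the number of distinct alternatives.

    Quantifies nonspecificity, i.e. the uncertainty in selecting one
    element from a finite, nonempty set when nothing else is known.
    """
    n = len(set(alternatives))
    if n == 0:
        raise ValueError("Hartley measure requires a nonempty set of alternatives.")
    return math.log2(n)


def shannon_entropy(probabilities):
    """Shannon entropy in bits: -sum(p * log2(p)) over nonzero probabilities.

    Quantifies the average uncertainty carried by a probability
    distribution over a finite set of outcomes.
    """
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("Probabilities must sum to 1.")
    return -sum(p * math.log2(p) for p in probabilities if p > 0)


if __name__ == "__main__":
    # Eight equally likely alternatives: both measures give 3 bits.
    print(hartley_measure(range(8)))                    # 3.0
    print(shannon_entropy([0.125] * 8))                 # 3.0
    # A skewed distribution carries less Shannon uncertainty.
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75
```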


Keywords: Maximum Entropy, Shannon Entropy, Belief Function, Possibility Distribution, Evidence Theory



Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • George J. Klir
    1. Center for Intelligent Systems and Department of Systems Science and Industrial Engineering, Thomas J. Watson School of Engineering and Applied Science, Binghamton University — SUNY, Binghamton, USA
  • Mark J. Wierman
    2. Center for Research in Fuzzy Mathematics and Computer Science and Mathematics and Computer Science Department, Creighton University, Omaha, USA
