Part of the Synthese Library book series (SYLI, volume 115)


We ordinarily think of induction as ‘learning from experience’, and the question naturally arises how to make such learning proceed most rapidly. The answer, in rough and ready terms, is to ask searching questions, and experimentation is conceived, in this spirit, as the art of asking such questions or probing Nature for her secrets. Now searching questions are those which promise to shed most light on the problem of interest, in a word, to deliver the highest expected yield of information. This already suggests the relevance of information theory, but it does not establish it, for information theory was developed by communication theorists and engineers to solve problems whose connection with efficient experimentation is less than obvious. Yet, as we will see, the connections are there all right, and it is part of our task in this chapter to articulate them.
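The "expected yield of information" of an experiment can be made precise as the expected reduction in uncertainty (Shannon entropy) about the true state once the outcome is observed, in the spirit of Lindley's measure of the information provided by an experiment. The following sketch is illustrative only, with hypothetical function names; it computes prior entropy minus expected posterior entropy for a discrete experiment.

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def expected_information_gain(prior, likelihoods):
    """Expected reduction in entropy about the true state from observing
    the experiment's outcome.

    prior[i]          = P(state_i)
    likelihoods[i][k] = P(outcome_k | state_i)
    """
    n_outcomes = len(likelihoods[0])
    # Marginal (predictive) probability of each outcome
    marg = [sum(prior[i] * likelihoods[i][k] for i in range(len(prior)))
            for k in range(n_outcomes)]
    # Expected posterior entropy, weighting each outcome by its probability
    exp_post = 0.0
    for k in range(n_outcomes):
        if marg[k] == 0:
            continue
        posterior = [prior[i] * likelihoods[i][k] / marg[k]
                     for i in range(len(prior))]
        exp_post += marg[k] * entropy(posterior)
    return entropy(prior) - exp_post

# Two equiprobable hypotheses: a perfectly discriminating test is worth
# one full bit, while an uninformative test is worth nothing.
prior = [0.5, 0.5]
perfect = [[1.0, 0.0], [0.0, 1.0]]
useless = [[0.5, 0.5], [0.5, 0.5]]
print(expected_information_gain(prior, perfect))  # 1.0
print(expected_information_gain(prior, useless))  # 0.0
```

On this reading, a "searching question" is simply an experiment whose expected information gain is maximal among those available.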


Keywords: True State, Conditional Entropy, Joint Entropy, Uniform Partition, Discrimination Information





Copyright information

© D. Reidel Publishing Company, Dordrecht, Holland 1977

Authors and Affiliations

Virginia Polytechnic Institute and State University, Blacksburg, USA
