About this book
The Minimum Message Length (MML) Principle is an information-theoretic approach to induction, hypothesis testing, model selection, and statistical inference. MML, which provides a formal specification for the implementation of Occam's Razor, asserts that the ‘best’ explanation of observed data is the shortest. Further, an explanation is acceptable (i.e. the induction is justified) only if the explanation is shorter than the original data.
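The two-part idea can be made concrete with a toy sketch (this is an illustration of the general principle, not Wallace's actual MML estimators; the fixed parameter precision and the example sequences are assumptions for the demonstration). A binary sequence is sent either verbatim, or as a two-part message: first an assertion of an estimated bias, then the data encoded under that estimate. The model is justified only when the two-part message is shorter than the raw data.

```python
import math

def raw_length(data):
    """Bits to send the data with no model: one bit per binary symbol."""
    return len(data)

def two_part_length(data, precision_bits=6):
    """Two-part message length in bits: assert an estimate of the bias p
    to a fixed precision, then encode the data with the Shannon code
    implied by that estimate (-log2 probability per symbol)."""
    n = len(data)
    k = sum(data)
    # Quantise the estimate of p to 2**precision_bits levels; stating
    # which level was chosen costs precision_bits bits (a crude assertion cost).
    levels = 2 ** precision_bits
    p = max(1, min(levels - 1, round(levels * k / n))) / levels
    assertion = precision_bits
    detail = -(k * math.log2(p) + (n - k) * math.log2(1 - p))
    return assertion + detail

biased = [1] * 45 + [0] * 5   # strongly biased sequence: the model pays for itself
fair_ish = [1, 0] * 25        # near-fair sequence: the model buys nothing

for data in (biased, fair_ish):
    print(raw_length(data), round(two_part_length(data), 1))
```

For the biased sequence the two-part message is well under the 50 raw bits, so the induction is justified; for the near-fair sequence the assertion cost exceeds the saving, so the raw data is the shorter explanation and no model is inferred.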
This book gives a sound introduction to the Minimum Message Length Principle and its applications, provides the theoretical arguments for the adoption of the principle, and shows the development of certain approximations that assist its practical application. MML appears also to provide both a normative and a descriptive basis for inductive reasoning generally, and scientific induction in particular. The book describes this basis and aims to show its relevance to the Philosophy of Science.
Statistical and Inductive Inference by Minimum Message Length will be of special interest to graduate students and researchers in Machine Learning and Data Mining, scientists and analysts in various disciplines wishing to make use of computer techniques for hypothesis discovery, statisticians and econometricians interested in the underlying theory of their discipline, and persons interested in the Philosophy of Science. The book could also be used in a graduate-level course in Machine Learning, Estimation and Model Selection, Econometrics, or Data Mining.
C.S. Wallace was appointed Foundation Chair of Computer Science at Monash University in 1968, at the age of 35, where he worked until his death in 2004. He received an ACM Fellowship in 1995, and was appointed Professor Emeritus in 1996. Professor Wallace made numerous significant contributions to diverse areas of Computer Science, such as Computer Architecture, Simulation and Machine Learning. His final research focused primarily on the Minimum Message Length Principle.
- Book Title Statistical and Inductive Inference by Minimum Message Length
- Series Title Information Science and Statistics
- DOI https://doi.org/10.1007/0-387-27656-4
- Copyright Information Springer Science+Business Media, Inc. 2005
- Publisher Name Springer, New York, NY
- eBook Packages Mathematics and Statistics (R0)
- Hardcover ISBN 978-0-387-23795-4
- Softcover ISBN 978-1-4419-2015-7
- eBook ISBN 978-0-387-27656-4
- Series ISSN 1613-9011
- Edition Number 1
- Number of Pages XVI, 432
- Number of Illustrations 22 b/w illustrations, 0 illustrations in colour
- Topics Statistical Theory and Methods; Coding and Information Theory; Probability and Statistics in Computer Science
From the reviews:
"The subject matter is highly technical, and the book is correspondingly detailed. The book is intended for graduate-level courses, and should be effective in that role if the instructor is sufficiently expert in the area. For researchers at the postdoctoral level, the book will provide a wealth of information about the field. … [T]he book is likely to remain the primary reference in the field for many years to come." (Donald Richards, JASA, June 2009, Vol. 104, No. 486)
"Any statistician interested in the foundations of the discipline, or the deeper philosophical issues of inference, will find this volume a rewarding read." (International Statistical Institute, December 2005)
"This very significant monograph covers the topic of the Minimum Message Length (MML) principle, a new approach to induction, hypothesis testing, model selection, and statistical inference. … This valuable book covers the topics at a level suitable for professionals and graduate students in Statistics, Computer Science, Data Mining, Machine Learning, Estimation and Model-selection, Econometrics etc." (Jerzy Martyna, Zentralblatt MATH, Vol. 1085, 2006)
"This book is around a simple idea: ‘The best explanation of the facts is the shortest’. … The book applies the above idea to statistical estimation in a Bayesian context. … I think it will be valuable for readers who have at the same time strong interest in Bayesian decision theory and in Shannon information theory." (Michael Kohler, Metrika, Vol. 64, 2006)