Abstract
We present a new way to understand and characterize the choice of scoring rule (probability loss function) used to evaluate a supplier of probabilistic predictions once the outcomes (true classes) are known. The ultimate value of a prediction (estimate) lies in the actual utility (loss reduction) accruing to someone who uses this information to make a decision. Often we cannot say with certainty that the prediction will be used in one particular decision problem, characterized by a particular loss matrix (indexed by outcome and decision) and hence by a particular decision threshold; instead, we consider the more general case of a distribution over such matrices. The proposed scoring rule is the expectation, with respect to this distribution, of the loss actually incurred when the decision recommendation is followed, the recommendation being the decision that would be optimal if the predicted probabilities were taken as true. The logarithmic and quadratic scoring rules arise from specific choices of this distribution, and even common single-threshold measures such as the ordinary misclassification score are obtained as degenerate special cases.
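The binary case of this construction can be sketched numerically. The following is an illustration under assumed notation, not the chapter's own code: normalize the loss matrix so that a false positive costs c and a false negative costs 1 − c, so the recommended decision under predicted probability p is "positive" iff p > c. Averaging the incurred loss over a uniform density on the threshold c then yields half the quadratic (Brier) score, while the density 1/(c(1 − c)) yields the logarithmic score.

```python
import numpy as np

def recommendation_loss(p, y, c):
    """Loss actually incurred at threshold(s) c when the decision follows
    the prediction p.  Normalized loss matrix: a false positive costs c,
    a false negative costs 1 - c; the recommendation under p is
    "positive" iff p > c (the Bayes-optimal act if p were the truth)."""
    decide_pos = c < p
    if y == 1:
        return np.where(decide_pos, 0.0, 1.0 - c)   # miss costs 1 - c
    return np.where(decide_pos, c, 0.0)             # false alarm costs c

def erl(p, y, weight, n=200_000):
    """Expected Recommendation Loss: incurred loss averaged over a
    density `weight` on the threshold c (trapezoid rule on a fine grid,
    truncated slightly inside (0, 1) to avoid endpoint singularities)."""
    c = np.linspace(1e-6, 1.0 - 1e-6, n + 1)
    f = recommendation_loss(p, y, c) * weight(c)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(c)))
```

For example, with y = 1 a uniform weight gives the integral of (1 − c) over [p, 1], i.e. (1 − p)²/2, half the per-case Brier score, while the weight 1/(c(1 − c)) gives the integral of 1/c over [p, 1], i.e. −ln p, the logarithmic score. A point mass at c = 1/2 (a degenerate "distribution" over matrices) would reduce the ERL to a scaled misclassification count.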
Copyright information
© 1996 Springer Science+Business Media Dordrecht
Cite this chapter
Rosen, D.B. (1996). How Good were those Probability Predictions? The Expected Recommendation Loss (ERL) Scoring Rule. In: Heidbreder, G.R. (eds) Maximum Entropy and Bayesian Methods. Fundamental Theories of Physics, vol 62. Springer, Dordrecht. https://doi.org/10.1007/978-94-015-8729-7_33
Publisher Name: Springer, Dordrecht
Print ISBN: 978-90-481-4407-5
Online ISBN: 978-94-015-8729-7