Communicating Uncertainty to Policymakers: The Ineliminable Role of Values

Abstract

Climate science evaluates hypotheses about the climate using computer simulations and complex models. The models that drive these simulations, moreover, represent the efforts of many different agents, and they arise from a compounding set of methodological choices whose effects are epistemically inscrutable. These facts, I argue in this chapter, make it extremely difficult for climate scientists to produce estimates of the degrees of uncertainty associated with these hypotheses that are free from the influence of past preferences: preferences both with regard to the importance of one prediction over another and with regard to the avoidance of false positives over false negatives, and vice versa. This leaves an imprint of non-epistemic values in the nooks and crannies of climate science.

Notes

  1.

    Of course, one might have worries about whether elected representatives generally represent the values of their constituents, but that is the subject of a different discussion.

  2.

    I variously use the expressions “social values,” “ethical values,” or “social and ethical values,” which should not be read as flagging important philosophical differences.

  3.

    See also (Frank 1954; Neurath 1913; Douglas 2000; Howard 2006; Longino 1990, 1996, 2002; Kourany 2003a, b; Solomon 2001; Wilholt 2009; Elliott 2011a, b).

  4.

    Many discussions of UQ in climate science will also identify data uncertainty. In evaluating a particular climate model, including both its structure and parameters, we compare the model’s output to real data. Climate modelers, for example, often compare the outputs of their models to records of past climate. These records can come from actual meteorological observations or from proxy data—inferences about past climate drawn from such sources as tree rings and ice core samples. Both of these sources of data, however, are prone to error, and so we are uncertain about the precise nature of the past climate. This, in turn, has consequences for our knowledge of the future climate. While data uncertainty is a significant source of uncertainty in climate modeling, I do not discuss this source of uncertainty here. For the purposes of this discussion, I make the crude assumption that the data against which climate models are evaluated are known with certainty. Notice, in any case, that data uncertainty is part of parameter uncertainty and structural uncertainty, since it acts by affecting our ability to judge the accuracy of our parameters and our model structures.

  5.

    A parameter for a model is an input that is fixed for all time, while a variable takes a value that varies with time. A variable for a model is thus both an input for the model (the value the variable takes at some initial time) and an output (the value the variable takes at all subsequent times). A parameter is simply an input.
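
    To make the distinction concrete, here is a minimal Python sketch of my own (a toy relaxation model, not drawn from any actual climate model): the forcing and feedback values are parameters, fixed for the whole run, while temperature is a variable whose initial value is an input and whose later values are outputs.

```python
# Toy illustration (my own construction, not from any real climate model).
# "forcing" and "lambda_feedback" are parameters: inputs fixed for the whole run.
# "temp_initial" seeds the variable: its initial value is an input, and its
# values at all later times are outputs.

def run_toy_model(temp_initial, forcing=3.7, lambda_feedback=1.2,
                  heat_capacity=8.0, n_steps=200):
    """Step a toy global-mean temperature anomaly forward in time."""
    temps = [temp_initial]
    for _ in range(n_steps):
        current = temps[-1]
        # relax toward the equilibrium response forcing / lambda_feedback
        change = (forcing - lambda_feedback * current) / heat_capacity
        temps.append(current + change)
    return temps

trajectory = run_toy_model(temp_initial=0.0)    # the parameters never change mid-run
print(trajectory[0], round(trajectory[-1], 2))  # input value vs. a later output value
```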

  6.

    Some might argue that if we look at how the models perform on past data (for, say, mean global surface temperature), their outputs are often distributed around the observations. But, first, these distributions do not display anything like random characteristics (e.g., a normal distribution). And, second, this feature of one variable for past data (the data to which the models have been tuned) is a poor indicator that it will also obtain for all variables and for future data.
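
    A small illustration of the worry, with invented numbers (my own sketch, not real model output):

```python
# Hypothetical numbers (invented for illustration, not real ensemble output). Even if
# a set of hindcasts is "distributed around" the observed value, the model-minus-
# observation residuals need not look like draws from a normal distribution centred
# on zero, and nothing guarantees the same bracketing for other variables or the future.

observed = 14.0                                   # invented "observed" past value
ensemble = [13.1, 13.2, 13.3, 14.7, 14.8, 14.9]   # invented hindcasts that bracket it

residuals = [x - observed for x in ensemble]
mean = sum(residuals) / len(residuals)
std = (sum((r - mean) ** 2 for r in residuals) / (len(residuals) - 1)) ** 0.5
print(f"mean residual = {mean:+.2f}, sample std = {std:.2f}")
# The residuals fall into two tight clumps away from zero rather than filling out a
# bell curve, so the spread cannot safely be read as a random sampling of model error.
```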

  7.

    Masson and Knutti (2011) discuss this phenomenon and its effects on multimodel sampling in detail.

  8.

    Shewhart (1939).

  9.

    Which, inter alia, did much to bring the issue of “inductive risk” back into focus for contemporary philosophy of science and epistemology.

  10.

    Whether they would do so in fact is not what is at issue here. Surely that would depend on features of their psychology and of the institutional structures they inhabit, about which we would have to have a great deal more empirical evidence before we could decide. What is at stake here is whether their social and ethical values would necessarily play a role in properly conducted science.

  11.

    See, for example, Goldstein and Rougier (2006).

  12.

    For an account of the controversies around early coupling, see Shackley et al. (1999); for a brief history of modeling advances, see Weart (2010).

  13.

    As, for example, in the earth system modeling framework. See, e.g., Dickinson et al. (2002).

  14.

    Because data are being continuously exchanged, one can accurately describe the models as parallel rather than serial, in the sense discussed in Winsberg (2006).
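
    A schematic sketch of my own (not drawn from any real coupler) of what continuous exchange amounts to: each component's next state depends on the other's current state, so neither can be run to completion before the other.

```python
# Schematic sketch (invented, not taken from any actual coupled model): with
# continuous exchange, the atmosphere and ocean components pass boundary values
# back and forth at every step, so the models run "in parallel" rather than one
# component finishing its run before handing its output to the other ("serial").

def step_atmosphere(atmos, sea_surface_temp):
    return 0.9 * atmos + 0.1 * sea_surface_temp     # toy update rule

def step_ocean(ocean, surface_flux):
    return 0.95 * ocean + 0.05 * surface_flux       # toy update rule

atmos, ocean = 1.0, 0.0
for _ in range(100):
    # each component's next state depends on the other's current state
    atmos, ocean = step_atmosphere(atmos, ocean), step_ocean(ocean, atmos)
print(round(atmos, 3), round(ocean, 3))
```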

  15.

    “Balance of approximations” is a term introduced by Lambert and Boer (2001) to indicate that climate models sometimes succeed precisely because the errors introduced by two different approximations cancel each other out.
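
    A toy numerical illustration of my own (invented values, not taken from Lambert and Boer): two individually inaccurate approximations whose errors cancel in the total.

```python
# Invented numbers, purely to illustrate the idea of a "balance of approximations":
# two parameterizations can each be individually wrong, yet the simulated total can
# come out right because their errors have opposite signs and cancel.

true_process_a = -20.0     # invented "true" contribution
true_process_b = -1.0      # invented "true" contribution

approx_process_a = -24.0   # approximation A errs by -4.0
approx_process_b = +3.0    # approximation B errs by +4.0

true_total = true_process_a + true_process_b          # -21.0
approx_total = approx_process_a + approx_process_b    # -21.0
print(approx_total - true_total)                      # 0.0: success for the wrong reasons
```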

  16.

    There has been a move, in recent years, to eliminate “legacy code” from climate models. Even though this may have been achieved in some models (this claim is sometimes made about CM2), it is worth noting that there is a large difference between coding a model from scratch and building it from scratch, that is, devising and sanctioning from scratch all of the elements of a model.

  17.

    See Rougier and Crucifix, this volume.

  18.

    I do not have the space to talk about what “manageably small” might mean here. But see our discussion of “catch and toss” group authorship in the work mentioned in the next note.

  19.

    One might reasonably wonder whether, in principle, a group could be an epistemic agent. In fact, this is the subject of a forthcoming paper by Bryce Huebner, Rebecca Kukla, and me. I would argue here, however, and hope that we will argue in more detail in that paper, that the analytic impenetrability of the models made by the groups involved here is an obstacle to these groups being agents with subjective degrees of belief.

  20.

    One can think of the contribution to this volume by Rougier and Crucifix as a recognition of, and attempt to address, this problem: that complex climate models are too complex to help climate scientists develop subjective degrees of belief.

  21.

    See especially Biddle and Winsberg (2009), and also Winsberg (2010, ch. 6).

  22.

    Here, my point is very well supported by Elisabeth Lloyd’s contribution to this volume. Her chapter chronicles in detail a very nice example of the kind of unforced methodological choice I am talking about: the choice of how to calibrate the relevant satellite data. The way Lloyd tells the story, the process involved a whole host of data-processing decisions and choices. I am simply adding to Lloyd’s narrative the observation that each of the decisions and choices she chronicles can be understood as being underwritten by balances of inductive risk and prediction preferences.

  23.

    One might complain that if the decisions do not reflect the explicit psychological motives or interests of the scientist, then they do not have a systematic effect on the content of science, and are hence no different from the uncontroversial examples of social values I mentioned in the introduction (such as attaching greater value to AIDS research than to algebraic quantum field theory). But though the values in the climate case might not have a systematic effect on the content of science, their effect is nonetheless internal to science in a way that those other examples are not.

  24.

    Again, Elisabeth Lloyd’s contribution to this volume illustrates this point.

  25.

    This comes from Parker’s remarks at the 2011 meeting of the Eastern Division of the American Philosophical Association, during an author-meets-critics session on my (2010).

  26.

    The probability that less than 66% of the probability mass lies inside the gray bar is a second-order probability because it talks about the probability of a probability.
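
    A small illustrative sketch of my own (the Beta distribution here is an arbitrary assumption, not anything from the chapter) of how such a second-order probability could be computed:

```python
# Illustrative sketch (my own construction): a second-order probability is a
# probability assigned to a statement that is itself about a probability. Let p be
# the unknown probability that the quantity falls inside the gray bar; represent our
# uncertainty about p with an (assumed) Beta(8, 3) distribution and estimate the
# second-order probability that p is below 0.66 by Monte Carlo.
import random

random.seed(0)
samples = [random.betavariate(8, 3) for _ in range(100_000)]
second_order = sum(p < 0.66 for p in samples) / len(samples)
print(f"estimated P(p < 0.66) = {second_order:.2f}")
```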

Works Cited

  • Allen, Myles. What Can Be Said About Future Climate? ClimatePrediction.net, June. Available at http://www.climateprediction.net/science/pubs/allen_Harvard2008.ppt. Accessed 3 July 2008.

  • Biddle, Justin, and Eric Winsberg. 2009. Value Judgments and the Estimation of Uncertainty in Climate Modeling. In New Waves in the Philosophy of Science, ed. P.D. Magnus and Jacob Busch. New York: Palgrave Macmillan.

  • Churchman, C. West. 1949. Theory of Experimental Inference. New York: Macmillan.

  • ———. 1956. Science and Decision Making. Philosophy of Science 23 (3): 247–249.

  • Clark, Andy. 1987. The Kluge in the Machine. Mind and Language 2 (4): 277–300.

  • Dickinson, Robert E., Stephen E. Zebiak, Jeffrey L. Anderson, et al. 2002. How Can We Advance Our Weather and Climate Models as a Community? Bulletin of the American Meteorological Society 83 (3): 431–434.

  • Douglas, Heather. 2000. Inductive Risk and Values in Science. Philosophy of Science 67 (4): 559–579.

  • Dunne, John. 2006. Towards Earth System Modelling: Bringing GFDL to Life. Paper Presented at the ACCESS 2006 BMRC Workshop. Available at http://www.cawcr.gov.au/bmrc/basic/wksp18/papers/Dunne_ESM.pdf. Accessed 11 Jan 2011.

  • Elliott, Kevin. 2011a. Direct and Indirect Roles for Values in Science. Philosophy of Science 78: 303–324.

  • ———. 2011b. Is a Little Pollution Good for You? Incorporating Societal Values in Environmental Research. New York: Oxford University Press.

  • Frank, Philipp G. 1954. The Variety of Reasons for the Acceptance of Scientific Theories. In The Validation of Scientific Theories, ed. Phillipp Frank, 3–17. Boston: Beacon Press.

  • Gleckler, Peter J., Karl E. Taylor, and Charles Doutriaux. 2008. Performance Metrics for Climate Models. Journal of Geophysical Research 113: D06104. https://doi.org/10.1029/2007JD008972.

  • Goldstein, Michael, and Jonathan C. Rougier. 2006. Bayes Linear Calibrated Prediction for Complex Systems. Journal of the American Statistical Association 101 (475): 1132–1143.

  • Howard, Don A. 2006. Lost Wanderers in the Forest of Knowledge: Some Thoughts on the Discovery-Justification Distinction. In Revisiting Discovery and Justification: Historical and Philosophical Perspectives on the Context Distinction, ed. Jutta Schickore and Friedrich Steinle, 3–22. New York: Springer.

  • IPCC (Intergovernmental Panel on Climate Change). 2001. Climate Change 2001: The Scientific Basis. Contribution of Working Group I to the Third Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge: Cambridge University Press.

  • ———. 2007. Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge: Cambridge University Press.

  • Jeffrey, Richard C. 1956. Valuation and Acceptance of Scientific Hypotheses. Philosophy of Science 23: 237–246.

  • Kourany, Janet. 2003a. A Philosophy of Science for the Twenty-First Century. Philosophy of Science 70 (1): 1–14.

  • ———. 2003b. Reply to Giere. Philosophy of Science 70 (1): 22–26.

  • Küppers, Günter, and Johannes Lenhard. 2006. Simulation and a Revolution in Modeling Style: From Hierarchical to Network-like Integration. In Simulation: Pragmatic Construction of Reality, Sociology of the Sciences, ed. Johannes Lenhard, Günter Küppers, and Terry Shinn, 89–106. Dordrecht: Springer.

  • Lambert, Steven, and George Boer. 2001. CMIP1 Evaluation and Intercomparison of Coupled Climate Models. Climate Dynamics 17 (2–3): 83–106.

  • Lenhard, Johannes, and Eric Winsberg. 2010. Holism, Entrenchment, and the Future of Climate Model Pluralism. Studies in History and Philosophy of Modern Physics 41: 253–262.

  • Lloyd, Elisabeth. 2012. The Role of ‘Complex’ Empiricism in the Debates About Satellite Data and Climate Models. Studies in History and Philosophy of Science 43: 390–401.

  • ———. 2015. Model Robustness as a Confirmatory Virtue: The Case of Climate Science. Studies in History and Philosophy of Science 49: 58–68. https://doi.org/10.1016/j.shpsa.2014.12.002.

  • Longino, Helen. 1990. Science as Social Knowledge: Values and Objectivity in Scientific Inquiry. Princeton: Princeton University Press.

  • ———. 1996. Cognitive and Non-cognitive Values in Science: Rethinking the Dichotomy. In Feminism, Science, and the Philosophy of Science, ed. Lynn Hankinson Nelson and Jack Nelson, 39–58. Dordrecht: Kluwer.

  • ———. 2002. The Fate of Knowledge. Princeton: Princeton University Press.

  • Masson, David, and Reto Knutti. 2011. Climate Model Genealogy. Geophysical Research Letters 38 (8): L08703. https://doi.org/10.1029/2011GL046864.

  • Neurath, Otto. 1913. Die Verirrten des Cartesius und das Auxiliarmotiv: Zur Psychologie des Entschlusses. In Jahrbuch der Philosophischen Gesellschaft an der Universität Wien, 45–59. Leipzig: Johann Ambrosius Barth.

  • Palmer, Tim N. 2001. A Nonlinear Dynamical Perspective on Model Error: A Proposal for Non-local Stochastic–Dynamic Parameterization in Weather and Climate Prediction Models. Quarterly Journal of the Royal Meteorological Society 127 (572): 279–304.

  • Parker, Wendy S. 2011. When Climate Models Agree: The Significance of Robust Model Predictions. Philosophy of Science 78 (4): 579–600.

  • Rudner, Richard. 1953. The Scientist Qua Scientist Makes Value Judgments. Philosophy of Science 20 (3): 1–6.

  • Shackley, Simon, James Risbey, Peter Stone, and Brian Wynne. 1999. Adjusting to Policy Expectations in Climate Change Science: An Interdisciplinary Study of Flux Adjustments in Coupled Atmosphere Ocean General Circulation Models. Climatic Change 43 (3): 413–454.

  • Shewhart, Walter A. 1939. Statistical Method from the Viewpoint of Quality Control. New York: Dover.

  • Smith, Leonard A. 2002. What Might We Learn from Climate Forecasts? Proceedings of the National Academy of Sciences USA 99 (Suppl. 1): 2487–2492.

  • Solomon, Miriam. 2001. Social Empiricism. Cambridge, MA: MIT Press.

  • Tebaldi, Claudia, and Reto Knutti. 2007. The Use of the Multimodel Ensemble in Probabilistic Climate Projections. Philosophical Transactions of the Royal Society A 365 (1857): 2053–2075.

  • Weart, Spencer. 2010. The Development of General Circulation Models of Climate. Studies in History and Philosophy of Modern Physics 41 (3): 208–217.

  • Wilholt, Torsten. 2009. Bias and Values in Scientific Research. Studies in History and Philosophy of Science 40: 92–101.

  • Wimsatt, William. 2007. Re-engineering Philosophy for Limited Beings: Piecewise Approximations to Reality. Cambridge, MA: Harvard University Press.

  • Winsberg, Eric. 2006. Handshaking Your Way to the Top: Simulation at the Nanoscale. Philosophy of Science 73 (5): 582–594.

  • ———. 2010. Science in the Age of Computer Simulation. Chicago: University of Chicago Press.

Acknowledgements

Thanks to Kevin Elliott, Rebecca Kukla, Elisabeth Lloyd, Wendy Parker, Isabelle Peschard, Bas van Fraassen, and Jessica Williams for helpful comments, criticisms, and suggestions as I worked on this manuscript. And thanks to all the participants at conferences and colloquia where I have presented earlier versions of this work, including at the Technical University Eindhoven, San Francisco State University, Georgetown University, the 2010 AGU meeting in San Francisco, and the University of South Florida, and at the 2011 Eastern APA Author Meets Critics session. Too many helpful suggestions, comments, and criticisms have been made to keep track of. Thanks to Justin Biddle and Johannes Lenhard for working with me on previous projects (see the bibliography) that have contributed immeasurably to my understanding of these topics.

Copyright information

© 2018 The Author(s)

Cite this chapter

Winsberg, E. (2018). Communicating Uncertainty to Policymakers: The Ineliminable Role of Values. In: Lloyd, E.A., Winsberg, E. (eds) Climate Modelling. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-65058-6_13
