
Are computer simulations experiments? And if not, how are they related to each other?

Original Paper in Philosophy of Science
European Journal for Philosophy of Science

Abstract

Computer simulations and experiments share many important features. One way of explaining the similarities is to say that computer simulations just are experiments, a claim that is quite popular in the literature. The aim of this paper is to argue against this claim and to develop an alternative explanation of why computer simulations resemble experiments. To this end, experiment is characterized in terms of an intervention on a system and the observation of its reaction. Thus, if computer simulations are experiments, either the computer hardware or the target system must be intervened on and observed. I argue against the first option using, among others, the non-observation argument. The second option is excluded by, e.g., the over-control argument, which stresses epistemological differences between experiments and simulations. To account for the similarities between experiments and computer simulations, I propose that computer simulations can model possible experiments and in fact often do so.


Notes

  1. See Efstathiou et al. (1985), Bertschinger (1998), and Dolag et al. (2008) for simulations in cosmology.

  2. E.g. Gramelsberger (2010).

  3. E.g. Naumova et al. (2008).

  4. Keller (2003), p. 203, in scare quotes.

  5. Humphreys (1994), p. 103.

  6. Dowling (1999), p. 261, in scare quotes.

  7. Authors as different as Winsberg (1999), p. 277, Stöckler (2000), p. 366 and Barberousse et al. (2009), pp. 558–559 agree that this is an important task for a philosophical treatment of CS.

  8. Consult Imbert (2017), Sec. 3.5, for a very useful overview of the recent debate.

  9. But see fn. 68 for a short remark on analog simulations.

  10. I can draw on a rich philosophical literature about experiments. See Hacking (1983), Part B, Janich (1995), Morrison (1998), Heidelberger (2005), Radder (2009), Bogen (2010) and Franklin (2010) for introductory pieces or reviews about scientific experiments. Radder (2003) is a recent collection in the philosophy of experiment. See Falkenburg (2007), particularly Ch. 2, for a recent account of experimentation with applications to particle physics. Biological experiments are philosophically analyzed by Weber (2005), experiments in economics by Sugden (2005). For studies about modern experiments see also Knorr-Cetina (1981) and Rheinberger (1997).

  11. See e.g. Janich (1995) and Parker (2009), p. 487 for the two components of experimentation. Causal interference in experiments is put into a historical perspective by Tiles (1993).

  12. Kant himself is eager to stress the conceptual work necessary to ask nature a well-defined question, but this is not important in what follows; our focus concerning the idea that nature is asked a question is rather on the causal interference with the system that is observed.

  13. See Hüttemann (2000) and Falkenburg (2007), Ch. 2 for a related discussion.

  14. In the example of the potter, I cannot exclude that the potter runs an experiment, if additional conditions are fulfilled. But even if she does, the respective experiment would not count as scientific. Scientific experiments are embedded in a broader scientific practice. As a consequence, the epistemic difference an experiment is supposed to make is more pronounced.

  15. See Guala (2002) and Winsberg (2009b), pp. 52–53.

  16. See e.g. Balzer (1997), p. 139.

  17. See Shapere (1982) and Hacking (1983), Chs. 10–11, Falkenburg (2007), pp. 65–71 and Humphreys (2004), Ch. 1 for broad accounts of observation.

  18. A recent monograph about theory-ladenness is Adam (2002).

  19. See Peschard (forthcoming), Sec. 1 for a similar account of experiment.

  20. For instance, Zimmerman (2003) calls his study a natural experiment. See Brown and Fehige (2017) for a recent overview of thought experiments.

  21. One may of course argue that natural and thought experiments are not really experiments, but this is not the place to do so.

  22. I’m grateful to an anonymous referee for pointing me to this fact.

  23. See Barberousse et al. (2009) and Humphreys (2013) for useful discussions of the notion of data in the context of CSs.

  24. Cf. the “identity thesis” mentioned by Winsberg (2009a), p. 840.

  25. Claims similar to CE and CE+ figure in J. Norton’s reduction of thought experiments to arguments; see Norton (1996) and cf. Beisbart (2012).

  26. So far, the focus of the philosophy of computer simulations has been on knowledge.

  27. To be fair, I should mention that Morrison’s paper also provides indications that she does not fully support CET. For one thing, the wording of her central claims is very cautious; she never says that CSs are experiments, but rather, e.g., that there are no reasons to maintain the distinction (e.g. p. 55). This claim also seems to be restricted to some “contexts” (p. 33). She further admits that computer simulations do not involve the manipulation of the target system (fn. 16 on p. 55). This concession does not seem to matter much for her argument; so, maybe, she does not think that intervention is crucial for experiment.

  28. CE, CE+ and CME try to clarify the relationship between experiments and simulations. But what exactly do they mean by CSs? There are broadly two ways of conceiving CSs depending on whether a CS is supposed to be one run with a simulation program or whether it is what Parker (2009), p. 488 calls a “computer simulation study”, which also includes writing the program, testing it, etc. (e.g. Frigg and Reiss 2009, p. 596; Parker 2009, p. 488). Analogous questions can be raised about experiments too, e.g. is the construction of the detector used in an experiment part of the latter or not? In what follows, I will not rely upon any specific proposal as to what is included in an experiment or a CS. Rather, I will assume that experiments and CSs are identified in a similar way such that the claims under consideration have a chance of being true. For instance, when we discuss CE, it would be too uncharitable to assume that experiments include detector building and similar activities, while a computer simulation is simply one run of a simulation program. What is important though for my argument is that every experiment includes an intervention on the object of the experiment.

  29. Cf. Hughes (1999), p. 137.

  30. Note that we are not here talking about observing in the sense of merely looking at something. Observation in that sense does not suffice for experimenting.

  31. See Rechenberg (2000), Chs. 2–3 for a brief description of the hardware of computers; the details do not matter for my purposes.

  32. Barberousse et al. (2009) seem to agree with my claim that the working scientist does not observe the hardware, for they write that

    “the running computer is not observed qua physical system, but qua computer, i.e., qua symbolic system” (p. 564).

  33. I’m grateful to an anonymous referee for raising this objection.

  34. See Press et al. (2007), p. 9 for an example of a suitable program.

  35. Such an argument is suggested by Imbert (2017), Sec. 3.5.

  36. This is so if intervention in the second condition is meant to be intentional. If this is not so, then the argument needs the third condition which makes it very likely that the intervention is intentional.

  37. The example of a simulation that is parallelized is also used by Barberousse et al. (2009, p. 565), albeit in a different argument.

  38. My arguments against CEH from this section can easily be generalized to show that computations carried out on computers (and not just simulations) do not include an experiment on the hardware.

  39. Some measurement apparatuses used in experiments function in a similar fashion. We do not observe certain measurement apparatuses, but rather use them to observe something else (cf. fn. 2 on p. 7 above). To do so we have to trust the instruments. There is thus a close parallel between computers as instruments and instruments in experimentation. Cf. Humphreys (2004), Ch. 1 and Parker (2010).

  40. For the purposes of this paper, we need not engage with Morrison’s argument in detail because Giere (2009) has made a convincing case against it.

  41. My first two arguments against CET may be summarized by the claim by Arnold (2013, pp. 58–59) that CSs do not operate on the target itself.

  42. Here, the last inferential step excludes the possibility that CSs include an experiment on the target even though the results of CSs are not experimental. This possibility can indeed be dismissed as being far-fetched and useless. Even if it were realized, we could not directly appeal to experimentation to explain the epistemic power of CSs.

  43. Properly speaking, it is the results of CSs (or experiments) that are (not) over-controlled; but for convenience, I will sometimes say that CSs (experiments) are (not) over-controlled. See fn. 42 for a justification.

  44. To elaborate a bit: The argument starts from what the working scientist wants. The reason is that the third condition on experiment is cast in terms of what the experimenter wants; it does not require that the experimenter be successful. The argument further assumes that the scientist is rational in that she only wants something that she takes to be possible and that she draws immediate consequences from her beliefs. For the objective absence of over-control, it is further assumed that the scientist correctly believes that she can learn about a reaction of the system. We can grant the additional assumptions because proponents of CET will not want to save their claim by claiming that scientists are not rational or that they do not know basic things about the concrete setting. If we don’t find the additional assumptions convincing, we may restrict ourselves to successful experiments in which the goal mentioned in the third condition is in fact fulfilled. Proponents of CET will not want to exclude such experiments. We can then argue that over-control would prevent success.

  45. Note though that the notion of a reaction does a lot of work for the argument since we understand reaction in a way that excludes objective over-control. It may be objected that, in the conditions on experiment, “reaction” may instead be taken to be any consequence of the intervention. The assumption that experiments exclude over-control would then need additional justification. I think that the discussion provided in this section does provide this justification.

  46. I’m grateful to an anonymous referee for raising this objection.

  47. There may be exceptions. For instance, I may use a system that is known to follow certain dynamical equations to identify a solution to these equations. Here I need to know the equations, otherwise I can’t interpret the experiment in the way I wish to. To save my claim that experiments are not over-controlled, I may either deny that we are really talking about an experiment here. Alternatively, I may claim that knowledge of the equations is not needed for the experiment proper, but only for an inference that is based upon it.

  48. See e.g. Beisbart and Norton (2012).

  49. This is also what Hughes (1999), p. 142 claims. I am here speaking of combinations of assumptions because Duhem (1954, Ch. 10, §§2–3) has taught us that many hypotheses cannot be tested or refuted in isolation.

  50. Here, simulations are distinguished from arguments to the effect that the simulations get it right. When arguments of this type are part and parcel of simulations, the latter can of course empirically refute a set of assumptions, as experiments can. But it is trivial that CSs in this extremely thick sense can do this. This trivial point cannot be the rationale for CET. Note also that purely mathematical theories may be falsified using computer simulations. But mathematical theories are not my concern here.

  51. What then are the conditions under which a CS can replace an experiment? Well, it can do so if the CS is known to reflect the intervention and the reaction of the system experimented on in a sufficiently faithful way. What sufficiently faithful representation means depends on the aspects that the working scientists are interested in and on the desired level of accuracy.

  52. Barberousse et al. (2009), pp. 562 and 565 make a similar point concerning the hardware of a computer. Their target is not so much the alleged experimental status of simulations as the idea that the physicality of the simulations is crucial for their epistemic power.

  53. That CSs model experiments is sometimes assumed in the sciences, too, see e.g. Haasl and Payseur (2011), p. 161. The claim also makes sense of the following remark by Metropolis and Ulam (1949) about Monte Carlo simulations:

    “These experiments will of course be performed not with any physical apparatus, but theoretically.” (p. 337).

  54. As I shall show below, my argument also goes through for an alternative account of modeling that does not assume similarity.

  55. See also Beisbart (2014) for the various ways in which computer simulations are related to models.

  56. This analysis is focused on deterministic simulations, but can be generalized to Monte-Carlo simulations. The latter produce many sample trajectories of the computer model. Each sample trajectory arises from initial conditions subject to quasi-intervention and produces outputs that may be quasi-observed.
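
    A minimal, purely illustrative sketch of this structure (mine, not the author's; the toy model and all names are invented): in the following Python lines, fixing the initial condition plays the role of the quasi-intervention, and reading off the outputs of the sample trajectories plays the role of the quasi-observation.

        import random

        def sample_trajectory(x0, steps=100):
            """One sample trajectory of a toy stochastic model (a simple random walk)."""
            x = x0
            for _ in range(steps):
                x += random.choice((-1, 1))
            return x

        def monte_carlo(x0, n_samples=1000):
            """Fix the initial condition (quasi-intervention), generate many sample
            trajectories, and return the mean output (what is quasi-observed)."""
            outputs = [sample_trajectory(x0) for _ in range(n_samples)]
            return sum(outputs) / n_samples

        # Varying x0 amounts to varying the quasi-intervention on the computer model.
        print(monte_carlo(x0=0))
        print(monte_carlo(x0=10))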

  57. I have here concentrated on the conditions on experimentation introduced in Section 2 above. These conditions have not been shown to be sufficient. This is not a problem because we are here not interested in the claim that an experiment is indeed run, but only in the proposition that an experiment is modeled. Now a model need not fully reflect its target. Thus, not every condition on experiment needs to be reflected in the model, provided that crucial aspects of experiments are represented. This clearly seems to be the case.

  58. A model in which all initial conditions and parameter values are fixed seems highly artificial, and even in this case, one may vary some of the model assumptions, which would suffice for quasi-intervention.

  59. My claim that a CS can be or even is a modeled experiment is not meant to imply that this CS can be or is an experiment. A modeled experiment is not an experiment, just as fake snow is not snow.

  60. If a CS does not actually model a possible experiment because it is supposed to reflect the way the target system does behave as a matter of fact, we may still say that it models a natural experiment. In such an experiment, no intervention is needed simply because the system that is observed happens to fulfill the conditions that are at the focus of the inquiry.

  61. Some similarities on the list from Section 3 also apply to CSs for which CME2 does not hold true. We can explain such similarities by saying that the simulations model only the behavior of a system as a reaction to certain conditions rather than also an intervention in which the system is subjected to the conditions.

  62. See e.g. Mainzer (1995), p. 467.

  63. It would be too much to claim that my proposal provides an independent explanation of the similarities listed. The reason is that some similarities are built into the proposal. What the proposal does though is to re-organize the similarities in a useful way. Of course, CE (plus, maybe, CE+) potentially reorganizes the similarities too, but CE and CE+ have been rejected on independent grounds.

  64. They may do so by imitating natural experiments, which are not experiments according to our partial explication.

  65. See also Imbert (2017), Sec. 5.2 for a similar strategy.

  66. All this was very helpfully pointed out by an anonymous referee who also noted that the problem is not just restricted to my account, but rather affects any view that embraces the following two claims: i. Experiments are not over-controlled, while CSs are. ii. CSs can replace experiments.

  67. In a similar way, Nagel (1986), p. 93 criticizes Berkeley’s so-called master argument because Berkeley confuses something that is needed to create an image with the content of the image.

  68. Can my main thesis, viz. that CSs can, and often do, model possible experiments, be generalized to what is called analog simulation? I here take it that, in an analog simulation, the target and a physical model of it may be described using the same type of dynamical equations (see the illustration below). The dynamics of the model is then investigated to learn about the dynamics of the target (Trenholme 1994). For instance, electric circuits may be used to study a fluid in this way (see Kroes (1989) for this example). Now, my claim that CSs can model possible experiments and often do so seems to apply to such analog simulations, too. For instance, if I set up an electric circuit to model a particular fluid, I’m modeling an experiment on the fluid. Note, however, the following difference between analog and computer simulations: If a possible experiment on the target is modeled in an analog simulation, this is a real experiment on the model system (the analogue). As I have argued above in Section 5, this is not so in computer simulations.
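
    To illustrate what “the same type of dynamical equations” means, here is a standard textbook analogy (my illustration, not the example discussed in the paper): a damped mass-spring system and a series RLC circuit obey equations of the same form,

        $$ m\ddot{x} + b\dot{x} + kx = F(t), \qquad L\ddot{q} + R\dot{q} + \frac{1}{C}\,q = V(t), $$

    with the correspondence $x \leftrightarrow q$, $m \leftrightarrow L$, $b \leftrightarrow R$, $k \leftrightarrow 1/C$, $F \leftrightarrow V$. Observing the charge $q(t)$ in the circuit can then stand proxy for observing the displacement $x(t)$ of the mechanical target; and, as noted in this footnote, such an observation is itself part of a real experiment on the analogue.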

References

  • Adam, M. (2002). Theoriebeladenheit und Objektivität. Zur Rolle von Beobachtungen in den Naturwissenschaften. Frankfurt am Main und London: Ontos.


  • Arnold, E. (2013). Experiments and simulations: Do they fuse? In Durán, J.M., & Arnold, E. (Eds.) Computer simulations and the changing face of scientific experimentation (pp. 46–75). Newcastle upon Tyne: Cambridge Scholars Publishing.

  • Balzer, W. (1997). Die Wissenschaft und ihre Methoden. Freiburg und München: Karl Alber.


  • Barberousse, A., Franceschelli, S., & Imbert, C. (2009). Computer simulations as experiments. Synthese, 169, 557–574.


  • Barker-Plummer, D. (2016). Turing machines. In Zalta, E.N. (Ed.), The Stanford encyclopedia of philosophy. Winter 2016 edn, Metaphysics Research Lab, Stanford University.


  • Baumberger, C. (2011). Understanding and its relation to knowledge. In Löffler, C.J.W. (Ed.) Epistemology: contexts, values, disagreement. Papers of the 34th international Wittgenstein symposium (pp. 16–18). Austrian Ludwig Wittgenstein Society.

  • Beisbart, C. (2012). How can computer simulations produce new knowledge? European Journal for Philosophy of Science, 2(2012), 395–434.

  • Beisbart, C. (2014). Are we Sims? How computer simulations represent and what this means for the simulation argument. The Monist, 97/3, 399–417.


  • Beisbart, C., & Norton, J.D. (2012). Why Monte Carlo simulations are inferences and not experiments. International Studies in the Philosophy of Science, 26, 403–422.


  • Bertschinger, E. (1998). Simulations of structure formation in the Universe. Annual Review of Astronomy and Astrophysics, 36, 599–654.


  • Binder, K., & Heermann, D. (2010). Monte Carlo simulation in statistical physics: An introduction, graduate texts in physics. Berlin: Springer Verlag.


  • Bogen, J. (2010). Theory and observation in science. In Zalta, E.N. (Ed.), The Stanford encyclopedia of philosophy. Spring 2010 edn. http://plato.stanford.edu/archives/spr2010/entries/science-theory-observation/.


  • Brown, J.R., & Fehige, Y. (2017). Thought experiments. In Zalta, E.N. (Ed.), The Stanford encyclopedia of philosophy. Summer 2017 edn.


  • Carnap, R. (1962). Logical foundations of probability, 2nd edn. Chicago: University of Chicago Press.


  • Casti, J.L. (1997). Would-be worlds. How simulation is changing the frontiers of science. New York: Wiley.


  • Dolag, K., Borgani, S., Schindler, S., Diaferio, A., & Bykov, A.M. (2008). Simulation techniques for cosmological simulations. Space Science Reviews, 134, 229–268. arXiv:0801.1023v1.


  • Dowling, D. (1999). Experimenting on theories. Science in Context, 12/2, 261–273.


  • Duhem, P.M.M. (1954). The aim and structure of physical theory, Princeton science library. Princeton, NJ: Princeton University Press.


  • Durán, J.M. (2013). The use of the materiality argument in the literature on computer simulations. In Durán, J.M., & Arnold, E. (Eds.), Computer simulations and the changing face of scientific experimentation (pp. 76–98). Newcastle upon Tyne: Cambridge Scholars Publishing.


  • Efstathiou, G., Davis, M., White, S.D.M., & Frenk, C.S. (1985). Numerical techniques for large cosmological N-body simulations. Ap J Suppl, 57, 241–260.


  • Falkenburg, B. (2007). Particle metaphysics. A critical account of subatomic reality. Heidelberg: Springer.


  • Franklin, A. (2010). Experiment in physics. In Zalta, E.N. (Ed.), The Stanford encyclopedia of philosophy. Spring 2010 edn.


  • Frigg, R.P., & Reiss, J. (2009). The philosophy of simulation: Hot new issues or same old stew? Synthese, 169, 593–613.

  • Fritzson, P. (2004). Principles of object-oriented modeling and simulation with Modelica 2.1. IEEE Press.

  • Giere, R.N. (2004). How models are used to represent. Philosophy of Science, 71, 742–752.


  • Giere, R.N. (2009). Is computer simulation changing the face of experimentation? Philosophical Studies, 143(1), 59–62.

  • Gillespie, D.T. (1976). A general method for numerically simulating the stochastic time evolution of coupled chemical reactions. Journal of Computational Physics, 22, 403–434.


  • Goodman, N. (1968). Languages of art: An approach to a theory of symbols. Indianapolis: Bobbs-Merrill.


  • Gramelsberger, G. (2010). Computerexperimente. Zum Wandel der Wissenschaft im Zeitalter des Computers. Transcript, Bielefeld.

  • Guala, F. (2002). Models, simulations, and experiments. In Magnani, L., & Nersessian, N. (Eds.), Model-based reasoning: science, technology, values (pp. 59–74). New York: Kluwer.


  • Guillemot, H. (2010). Connections between simulations and observation in climate computer modeling. Scientist’s practices and bottom-up epistemology lessons. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 41, 242–252. Special Issue: Modelling and simulation in the atmospheric and climate sciences.


  • Haasl, R.J., & Payseur, B.A. (2011). Multi-locus inference of population structure: a comparison between single nucleotide polymorphisms and microsatellites. Heredity, 106, 158–171.


  • Hacking, I. (1983). Representing and intervening. Cambridge: Cambridge University Press.


  • Hasty, J., McMillen, D., Isaacs, F., & Collins, J.J. (2001). Computational studies of gene regulatory networks: In numero molecular biology. Nature Reviews Genetics, 2, 268–279.


  • Heidelberger, M. (2005). Experimentation and instrumentation. In Borchert, D. (Ed.), Encyclopedia of philosophy. Appendix (pp. 12–20). New York: Macmillan.


  • Hughes, R.I.G. (1997). Models and representation. Philosophy of Science (Proceedings), 64, S325–S336.


  • Hughes, R.I.G. (1999). The Ising model, computer simulation, and universal physics. In Morgan, M.S., & Morrison, M. (Eds.), Models as mediators. Perspectives on natural and social sciences (pp. 97–145). Cambridge: Cambridge University Press.


  • Humphreys, P. (1990). Computer simulations. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, 1990, 497–506.


  • Humphreys, P. (1994). Numerical experimentation. In Humphreys, P. (Ed.), Patrick Suppes. Scientific philosopher (Vol. 2, pp. 103–118). Dordrecht: Kluwer.

  • Humphreys, P. (2004). Extending ourselves: Computational science, empiricism, and scientific method. New York: Oxford University Press.


  • Humphreys, P.W. (2013). What are data about? In Durán, J.M., & Arnold, E. (Eds.) Computer simulations and the changing face of scientific experimentation (pp. 12–28). Newcastle upon Tyne: Cambridge Scholars Publishing.

  • Hüttemann, A. (2000). Natur und Labor. Über die Grenzen der Gültigkeit von Naturgesetzen. Philosophia Naturalis, 37, 269–285.


  • Imbert, C. (2017). Computer simulations and computational models in science. In Magnani, L., & Bertolotti, T. (Eds.), Springer handbook of model-based science (Ch. 34, pp. 733–779). Cham: Springer.

  • Janich, P. (1995). Experiment. In Mittelstraß, J. (Ed.), Enzyklopädie Philosophie und Wissenschaftstheorie. Band 1, Metzler, Stuttgart (pp. 621–622).


  • Kant, I. (1998). Critique of pure reason. Cambridge: Cambridge University Press. translated by P. Guyer and A. W. Wood; Cambridge Edition of the Works of Kant.


  • Keller, E.F. (2003). Models, simulation, and computer experiments. In Radder, H. (Ed.), The philosophy of scientific experimentation (pp. 198–215). Pittsburgh: University of Pittsburgh Press.


  • Knorr-Cetina, K. (1981). The manufacture of knowledge: An essay on the constructivist and contextual nature of science. Pergamon international library of science, technology, engineering, and social studies, Pergamon Press.

  • Kroes, P. (1989). Structural analogies between physical systems. British Journal for the Philosophy of Science, 40, 145–154.


  • Küppers, G., & Lenhard, J. (2005). Computersimulationen: Modellierungen 2. Ordnung. Journal for General Philosophy of Science, 36(2), 305–329.


  • Lim, S., McKee, J.L., Woloszyn, L., Amit, Y., Freedman, D.J., Sheinberg, D.L., & Brunel, N. (2015). Inferring learning rules from distributions of firing rates in cortical neurons. Nature Neuroscience, 18, 1804–1810.


  • Mainzer, K. (1995). Computer – neue Flügel des Geistes? Die Evolution computergestützter Technik, Wissenschaft, Kultur und Philosophie, 2nd edn. Berlin, New York: de Gruyter Verlag.


  • Metropolis, N., & Ulam, S. (1949). The Monte Carlo method. Journal of the American Statistical Association, 44(247), 335–341.


  • Michelson, A.A. (1881). The relative motion of the earth and the luminiferous ether. American Journal of Science, 22, 120–129.


  • Michelson, A.A., & Morley, E.W. (1887). On the relative motion of the earth and the luminiferous ether. American Journal of Science, 34, 333–345.


  • Morgan, M.S. (2002). Model experiments and models in experiments. In Magnani, L., & Nersessian, N. (Eds.), Model-based reasoning: science, technology, values (pp. 41–58). New York: Kluwer.


  • Morgan, M.S. (2003). Experimentation without material intervention: Model experiments, virtual experiments, and virtually experiments. In Radder, H. (Ed.), The philosophy of scientific experimentation (pp. 216–235). Pittsburgh: University of Pittsburgh Press.


  • Morgan, M.S. (2005). Experiments versus models: New phenomena, inference and surprise. Journal of Economic Methodology, 12(2), 317–329.


  • Morrison, M. (1998). Experiment. In Craig, E. (Ed.) Routledge encyclopedia of philosophy (Vol. III, pp. 514–518). London: Routledge and Kegan.

  • Morrison, M. (2009). Models, measurement and computer simulation: The changing face of experimentation. Philosophical Studies, 143, 33–57.


  • Nagel, T. (1986). The view from nowhere. Oxford: Oxford University Press.


  • Naumova, E.N., Gorski, J., & Naumov, Y.N. (2008). Simulation studies for a multistage dynamic process of immune memory response to influenza: Experiment in silico. Annales Zoologici Fennici, 45, 369–384.


  • Norton, J.D. (1996). Are thought experiments just what you thought? Canadian Journal of Philosophy, 26, 333–366.

  • Norton, S.D., & Suppe, F. (2001). Why atmospheric modeling is good science. In Edwards, P., & Miller, C. (Eds.), Changing the atmosphere (pp. 67–106). Cambridge, MA: MIT Press.


  • Parker, W.S. (2008). Franklin, Holmes, and the epistemology of computer simulation. International Studies in the Philosophy of Science, 22(2), 165–183.


  • Parker, W.S. (2009). Does matter really matter? Computer simulations, experiments, and materiality. Synthese, 169(3), 483–496.


  • Parker, W.S. (2010). An instrument for what? Digital computers, simulation and scientific practice. Spontaneous Generations, 4(1), 39–44.


  • Peschard, I. (forthcoming). Is simulation a substitute for experimentation? In Vaienti, S., & Livet, P. (Eds.) Simulations and networks. Aix-Marseille: Presses Universitaires d’Aix-Marseille. Here quoted after the preprint http://d30056166.purehost.com/Is_simulation_an_epistemic%20_substitute.pdf.

  • Press, W.H., Teukolsky, S.A., Vetterling, W.T., & Flannery, B.P. (2007). Numerical recipes. The art of scientific computing, 3rd edn. New York: Cambridge University Press.


  • Radder, H. (2009). The philosophy of scientific experimentation: A review. Automatic Experimentation, 1. Open access: http://www.aejournal.net/content/1/1/2.

  • Radder, H. (Ed.) (2003). The philosophy of scientific experimentation. Pittsburgh: University of Pittsburgh Press.

  • Rechenberg, P. (2000). Was ist Informatik? Eine allgemeinverständliche Einführung, 3rd edn. München: Hanser.


  • Rheinberger, H.J. (1997). Toward a history of epistemic things: Synthesizing proteins in the test tube. Writing science, Stanford University Press.

  • Rohrlich, F. (1990). Computer simulation in the physical sciences. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, 1990, 507–518.


  • Scholz, O. R. (2004). Bild, Darstellung, Zeichen. Philosophische Theorien bildlicher Darstellung, 2nd edn. Frankfurt am Main: Vittorio Klostermann.


  • Shapere, D. (1982). The concept of observation in science and philosophy. Philosophy of Science, 49(4), 485–525.


  • Skaf, R.E., & Imbert, C. (2013). Unfolding in the empirical sciences: experiments, thought experiments and computer simulations. Synthese, 190(16), 3451–3474.


  • Stöckler, M. (2000). On modeling and simulations as instruments for the study of complex systems. In Carrier, M., Massey, G.J., & Ruetsche, L. (Eds.), Science at the century’s end: Philosophical questions on the progress and limits of science (pp. 355–373). Pittsburgh, PA: University of Pittsburgh Press.


  • Suárez, M. (2003). Scientific representation: Against similarity and isomorphism. International Studies in the Philosophy of Science, 17, 225–244.


  • Suárez, M. (2004). An inferential conception of scientific representation. Philosophy of Science, 71, 767–779.


  • Sugden, R. (Ed.) (2005). Experiment, theory, world: A symposium on the role of experiments in economics, Vol. 12/2. London: Routledge. Special issue of Journal of Economic Methodology.

  • Tiles, J.E. (1993). Experiment as intervention. British Journal for the Philosophy of Science, 44(3), 463–475.


  • Trenholme, R. (1994). Analog simulation. Philosophy of Science, 61(1), 115–131.


  • Turing, A. (1937). On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society, s2-42(1).


  • Weber, M. (2005). Philosophy of experimental biology. Cambridge: Cambridge University Press.


  • Weisberg, M. (2007). Who is a modeler? British Journal for the Philosophy of Science, 58, 207–233.

  • Winsberg, E. (1999). Sanctioning models. The epistemology of simulation. Science in Context, 12, 275–292.


  • Winsberg, E. (2003). Simulated experiments: Methodology for a virtual world. Philosophy of Science, 70, 105–125.


  • Winsberg, E. (2009a). Computer simulation and the philosophy of science. Philosophy Compass, 4/5, 835–845.


  • Winsberg, E. (2009b). A tale of two methods. Here quoted from Winsberg (2010), Ch. 4, pp. 49–71.

  • Winsberg, E. (2010). Science in the age of computer simulations. Chicago: University of Chicago Press.


  • Zimmerman, D.J. (2003). Peer effects in academic outcomes: Evidence from a natural experiment. The Review of Economics and Statistics, 85(1), 9–23.



Acknowledgments

Thanks to Christoph Baumberger and Trude Hirsch Hadorn for extremely useful comments on an earlier version of this manuscript. I’m also very grateful for detailed and helpful comments and criticisms by two anonymous referees. One of them provided extensive, constructive and extremely helpful comments even about a revised version of this paper – thanks a lot for this!

Author information

Correspondence to Claus Beisbart.



Cite this article

Beisbart, C. (2018). Are computer simulations experiments? And if not, how are they related to each other? European Journal for Philosophy of Science, 8, 171–204. https://doi.org/10.1007/s13194-017-0181-5

