Abstract
This chapter explores a variety of ways that ethical and societal values associated with environmental policy making move “upstream” into the practice of policy-relevant scientific research. In the case of nanotoxicology, researchers face value-laden decisions about what materials to study, what biological models to employ, which effects to examine, and what standards of evidence to demand. Depending on how these choices are made, they can support the interests of those who want to aggressively protect environmental and public health, or they can benefit the regulated industries that are trying to market new products. In order to incorporate more effective ethical and societal reflection on these decisions, the chapter suggests developing socially sensitive research-ethics training, developing appropriate forms of deliberation, and strategically investing in independently funded research.
Notes
1. I should emphasize that, while Barrett and Raffensperger do an admirable job of highlighting the implicit value judgments that can permeate scientific research, their proposal of a sharp distinction between “mechanistic science” and “precautionary science” is questionable. Particular research practices can arguably be classified as precautionary only relative to a particular context (including, for example, the threats that are under consideration, the preventive actions being considered in response to the threats, and an alternative set of research practices that are less precautionary).
2. It is important to recognize that advocates of the precautionary principle are by no means the only thinkers who have studied how scientific practices can privilege some ethical or societal values over others. I have focused on this particular group of thinkers because they have done a good job of highlighting the value-ladenness of scientific research and because their concerns apply well to nanotoxicology.
3. It is worth emphasizing that even decisions about whether to emphasize in vitro, in vivo, or in silico experimental systems involve a wide range of value judgments about how to prioritize considerations like the speed of research, avoidance of false positive and false negative errors, expense, and animal welfare.
4. Regarding the sensitivity of biological models, Tom Chandler (personal communication, 2009) provides a good example. He notes that daphnia and copepods are both small crustaceans that are used for studying the effects of environmental toxicants. Daphnia have been used more frequently, in part because they have generally been more convenient to study and to grow in the laboratory. Nevertheless, copepods tend to be more sensitive to some toxicants. Regarding the ethics of animal experimentation, Lafollette and Shanks (1997) provide an excellent overview of the issues. In some cases, computer modeling and bioinformatics may enable researchers to identify potential threats more quickly and with less harm to animal welfare than by using traditional in vivo approaches.
5. For more information about this program, see http://www.cspo.org/outreach/phdplus/; last accessed on August 19, 2009.
6. For more information, see http://www.cns.ucsb.edu/education/; last accessed on August 19, 2009.
7. For more information about the National Citizens’ Technology Forum, see Philbrick and Barandiaran (2009). The final report for the DEEPEN project is available at http://www.geography.dur.ac.uk/projects/deepen/NewsandEvents/tabid/2903/Default.aspx (last accessed on March 5, 2010), and more information about the Demos project and its nanodialogues is available at http://www.demos.co.uk/ (last accessed on March 5, 2010).
8. Admittedly, government agencies are also influenced by a wide range of values and concerns. The point of promoting government funding is not to remove all value influences from scientific research but rather to counteract the radical, egregious biases associated with much industry-funded research (see McGarity and Wagner 2008; Michaels 2008).
References
Angell, M. 2004. The truth about the drug companies: How they deceive us and what to do about it. New York: Random House.
APHA (American Public Health Association). 2003. Supporting legislation for independent post-marketing phase IV comparative evaluation of pharmaceuticals. Washington, DC: APHA. Available at http://www.apha.org/advocacy/policy/policysearch/default.htm?id=1265. Accessed on 12 Sept 2007.
Balbus, J., et al. 2007. Hazard assessment for nanoparticles: Report from an interdisciplinary workshop. Environmental Health Perspectives 115: 1654–1659.
Barnard, A. 2009. How can ab initio simulations address risks in nanotech? Nature Nanotechnology 4: 332–335.
Barrett, K., and C. Raffensperger. 1999. Precautionary science. In Protecting public health and the environment, ed. C. Raffensperger and J. Tickner, 106–122. Washington, DC: Island Press.
Beierle, T. 2002. The quality of stakeholder-based decisions. Risk Analysis 22: 739–749.
Biello, D. 2006. Mixing it up. Scientific American, May 10. Available online at: http://www.sciam.com/article.cfm?id=mixing-it-up. Last accessed on 16 Apr 2009.
Boverhof, D., et al. 2006. Comparative toxicogenomic analysis of the hepatotoxic effects of TCDD in Sprague Dawley rats and C57BL/6 mice. Toxicological Sciences 94: 398–416.
Bowman, D., and G. van Calster. 2008. Flawless or fallible? A review of the applicability of the European Union’s cosmetics directive in relation to nano-cosmetics. Studies in Ethics, Law, and Technology 2: Article 6.
Calow, P., and V. Forbes. 2003. Does ecotoxicology inform ecological risk assessment? Environmental Science and Technology 37: 147A–151A.
Chandler, T., et al. 2004. Population consequences of fipronil and degradates to copepods at field concentrations: An integration of life cycle testing with Leslie matrix population modeling. Environmental Science and Technology 38: 6407–6414.
Cranor, C. 1990. Some moral issues in risk assessment. Ethics 101: 123–143.
Cranor, C. 1995. The social benefits of expedited risk assessments. Risk Analysis 15: 353–358.
Cranor, C. 1999. Asymmetric information, the precautionary principle, and burdens of proof. In Protecting public health & the environment: Implementing the precautionary principle, ed. C. Raffensperger and J. Tickner, 74–99. Washington, DC: Island Press.
Douglas, H. 2000. Inductive risk and values in science. Philosophy of Science 67: 559–579.
Douglas, H. 2003. The moral responsibilities of scientists: Tensions between responsibility and autonomy. American Philosophical Quarterly 40: 59–68.
Douglas, H. 2009. Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.
Eggen, R., et al. 2004. Challenges in ecotoxicology. Environmental Science and Technology 38: 59A–64A.
Elliott, K. 2008. A case for deliberation in response to hormesis research. Human and Experimental Toxicology 27: 529–538.
Elliott, K. 2011. Is a little pollution good for you? Incorporating societal values in environmental research. New York: Oxford University Press.
Elliott, K., and D. McKaughan. 2009. How values in scientific discovery and pursuit alter theory appraisal. Philosophy of Science 76: 598–611.
Elliott, K., and D. Volz. 2012. Addressing conflicts of interest in nanotechnology oversight: Lessons learned from drug and pesticide safety testing. Journal of Nanoparticle Research 14: 664–668.
Fagin, D., M. Lavelle, and The Center for Public Integrity. 1999. Toxic deception. Monroe: Common Courage Press.
Fiorino, D. 1990. Citizen participation and environmental risk: A survey of institutional mechanisms. Science, Technology, and Human Values 15: 226–243.
Fischer, F. 1993. Citizen participation and the democratization of policy expertise: From theoretical inquiry to practical cases. Policy Sciences 26: 165–187.
Fisher, E. 2007. Ethnographic invention: Probing the capacity of laboratory decisions. NanoEthics 1: 155–165.
Grandjean, P. 2005. Implications of the precautionary principle for public health practice and research. Human and Ecological Risk Assessment 11: 13–15.
Guston, D. 1999. Evaluating the first US consensus conference: The impact of the citizen’s panel on telecommunications and the future of democracy. Science, Technology, and Human Values 24: 451–482.
Heinrich, U., et al. 1995. Chronic inhalation exposure of Wistar rats and 2 different strains of mice to diesel-engine exhaust, carbon-black, and titanium-dioxide. Inhalation Toxicology 7: 533–556.
Hill, A., H. Teraoka, W. Heideman, and R. Peterson. 2005. Zebrafish as a model vertebrate for investigating chemical toxicity. Toxicological Sciences 86: 6–19.
Irvin, R., and J. Stansbury. 2004. Citizen participation in decision making: Is it worth the effort? Public Administration Review 64: 55–65.
Kleinman, D. 2000. Science, technology, and democracy. Albany: SUNY Press.
Kleinman, D. 2005. Science and technology in society: Biotechnology and the internet. Oxford: Blackwell.
Kriebel, D., et al. 2001. The precautionary principle in environmental science. Environmental Health Perspectives 109: 871–876.
Krimsky, S. 2003. Science in the private interest. Lanham: Rowman & Littlefield.
Kuhn, T. 1977. Objectivity, value judgment, and theory choice. In The essential tension, 320–339. Chicago: University of Chicago Press.
Lacey, H. 1999. Is science value free? London: Routledge.
Lacey, H. 2002. The ways in which the sciences are and are not value free. In In the scope of logic, methodology and philosophy of science, vol. 2, ed. P. Gärdenfors, J. Wolénski, and K. Kijania-Placek. Dordrecht: Kluwer.
Lafollette, H., and N. Shanks. 1997. Brute science: Dilemmas of animal experimentation. London: Routledge.
Liu, X., et al. 2009. Differential toxicity of carbon nanomaterials in Drosophila: Larval dietary uptake is benign, but adult exposure causes locomotor impairment and mortality. Environmental Science and Technology 43: 6357–6363.
Longino, H. 1990. Science as social knowledge. Princeton: Princeton University Press.
Maynard, A. 2008. Testimony for the U.S. House of Representatives Committee on Science & Technology, Hearing on the National Nanotechnology Initiative Amendments Act of 2008. Available at: http://democrats.science.house.gov/Media/File/Commdocs/hearings/2008/Full/16apr/Maynard_Testimony.pdf. Last accessed on 20 Aug 2009.
McGarity, T., and W. Wagner. 2008. Bending science: How special interests corrupt public health research. Cambridge, MA: Harvard University Press.
McGregor, J., and J. Wetmore. 2009. Researching and teaching the ethics and social implications of emerging technologies in the laboratory. NanoEthics 3: 17–30.
McMullin, E. 1983. Values in science. In PSA 1982, vol. 2, ed. P. Asquith and T. Nickles, 3–28. East Lansing: Philosophy of Science Association.
Michaels, D. 2008. Doubt is their product: How industry’s assault on science threatens your health. New York: Oxford University Press.
NRC (National Research Council). 1996. Understanding risk: Informing decisions in a democratic society. Washington, DC: National Academy Press.
Oberdörster, G., et al. 2005. Principles for characterizing the potential human health effects from exposure to nanomaterials: Elements of a screening strategy. Particle and Fibre Toxicology 2: 8.
Philbrick, P., and J. Barandiaran. 2009. The national citizens’ technology forum: Lessons for the future. Science and Public Policy 36: 335–347.
Pimple, K. 2002. Six domains of research ethics: A heuristic framework for the responsible conduct of research. Science and Engineering Ethics 8: 191–205.
Poland, C., et al. 2008. Carbon nanotubes introduced into the abdominal cavity of mice show asbestos-like pathogenicity in a pilot study. Nature Nanotechnology 3: 423–428.
Raffensperger, C., and J. Tickner, eds. 1999. Protecting public health & the environment: Implementing the precautionary principle. Washington, DC: Island Press.
Ramachandran, G., et al. 2011. Recommendations for oversight of nanobiotechnology: Dynamic oversight for complex and convergent technology. Journal of Nanoparticle Research 13: 1345–1371.
Schmidtz, D. 2001. A place for cost-benefit analysis. Philosophical Issues 11: 148–171.
Service, R. 2008. Report faults U.S. strategy for nanotoxicology research. Science 322: 1779.
Shrader-Frechette, K. 1985. Risk analysis and scientific method: Methodological and ethical problems with evaluating societal hazards. Boston: Kluwer.
Shrader-Frechette, K. 1991. Risk and rationality: Philosophical foundations for populist reforms. Berkeley: University of California Press.
Shrader-Frechette, K. 1993. Consent and nuclear waste disposal. Public Affairs Quarterly 7: 363–377.
Shrader-Frechette, K. 1994. Ethics of scientific research. Lanham: Rowman & Littlefield.
Shrader-Frechette, K. 2007a. Nanotoxicology and ethical conditions for informed consent. NanoEthics 1: 47–56.
Shrader-Frechette, K. 2007b. Taking action, saving lives: Our duties to protect environmental and public health. New York: Oxford University Press.
Steel, D. 2010. Epistemic values and the argument from inductive risk. Philosophy of Science 77: 14–34.
Sunstein, C. 2005. Laws of fear: Beyond the precautionary principle. New York: Cambridge University Press.
Templeton, R., et al. 2006. Life-cycle effects of single-walled carbon nanotubes (SWNTs) on an estuarine meiobenthic copepod. Environmental Science and Technology 40: 7387–7393.
The Royal Society and the Royal Academy of Engineering, UK. 2004. Nanoscience and nanotechnologies. Available at: http://www.nanotec.org.uk/finalReport.htm.
Tickner, J. 2005. Commentary: Barriers and opportunities to changing the research agenda to support precaution and primary prevention. Human and Ecological Risk Assessment 11: 221–234.
Volz, D., and K. Elliott. 2012. Mitigating conflicts of interest in chemical safety testing. Environmental Science and Technology 46: 7937–7938.
vom Saal, F., and C. Hughes. 2005. An extensive new literature concerning low-dose effects of bisphenol A shows the need for a new risk assessment. Environmental Health Perspectives 113: 926–933.
von Schomberg, R. 2006. The precautionary principle and its normative challenges. In Implementing the precautionary principle: Perspectives and prospects, ed. E. Fisher, J. Jones, and R. von Schomberg, 19–41. Northampton: Edward Elgar.
Wahlström, B. 1999. The precautionary approach to chemicals management: A Swedish perspective. In Protecting public health & the environment: Implementing the precautionary principle, ed. C. Raffensperger and J. Tickner, 51–70. Washington, DC: Island Press.
Acknowledgments
I thank Tara Sabo-Attwood and Tom Chandler for very helpful scientific input and examples. This work was supported by the U.S. National Science Foundation under Grant No. 0809470. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.
© 2014 Springer Science+Business Media Dordrecht
Elliott, K.C. (2014). Ethical and Societal Values in Nanotoxicology. In: Gordijn, B., Cutter, A. (eds) In Pursuit of Nanoethics. The International Library of Ethics, Law and Technology, vol 10. Springer, Dordrecht. https://doi.org/10.1007/978-1-4020-6817-1_10
Publisher Name: Springer, Dordrecht
Print ISBN: 978-1-4020-6816-4
Online ISBN: 978-1-4020-6817-1