Twenty-first Century Governance Challenges in the Life Sciences

  • Tatyana Novossiolova
Part of the Global Issues book series (GLOISS)


The chapter explores the rapid advancement of biotechnology over the past few decades, outlining an array of factors that drive innovation and, at the same time, raise concerns about the extent to which the scope and pace of novel life science developments can be adequately governed. From ‘dual-use life science research of concern’ through the rise of amateur biology to the advent of personalised medicine, the chapter exposes the limitations of the existing governance mechanisms in accommodating the multifaceted ethical, social, security, and legal concerns arising from cutting-edge scientific and technological developments.

Biotechnology Advancement in the Twenty-First Century

Depending on its application, biotechnology falls into three broad categories: ‘red’ biotechnology encompassing R&D in the medical and healthcare sectors (e.g. drug development, disease diagnostics, prevention and treatment); ‘green’ biotechnology related to agriculture (e.g. increasing plant resilience to drought, herbicides and pesticides); and ‘white’ biotechnology covering innovation for industrial purposes (e.g. environment-friendly products).1 The expansion of all three types of biotechnology over the past several decades has been truly breathtaking, in both qualitative and quantitative terms. Forty years ago, scientists marvelled at the ability to manipulate the manifestations of life by means of gene-splicing; the tools and technologies available at the beginning of the twenty-first century have enabled them to create life forms from scratch.2 Similarly, when initially conceived the Human Genome Project (HGP) seemed a daunting undertaking, yet less than ten years after its completion, genome-based diagnostics and therapeutics are growing at a remarkable pace. And whereas cutting-edge life science research was once confined to prestigious universities and state-of-the-art laboratories in the highly industrialised countries of the global North, studies involving highly dangerous microbes are nowadays conducted in research facilities scattered around the globe, from Indonesia and Vietnam through Kenya and Morocco to Moldova and Pakistan.
One authoritative high-level review has even gone so far as to suggest that the ‘life sciences knowledge, materials and technologies are advancing worldwide with Moore’s Law-like speed.’3 And whilst some commentators have questioned the extent to which the ongoing progress of biotechnology has translated into practical applications and novel products,4 there is a degree of consensus that the biotechnology landscape has been fundamentally transformed over recent decades and that the possibilities now unlocked hold revolutionary potential. Indeed, rapid advances in the field have produced a knowledge base and a set of tools and techniques that enable biological processes to be understood, manipulated and controlled to an extent never possible before5; they have found applications in numerous spheres of life, generating enormous benefits and offering bright prospects for human betterment; and they have come to be regarded as a key driver of economic development, with the potential to close the gap between resource-rich and resource-poor countries.6

The progress of biotechnology has been largely driven by three sets of forces, namely social, political and economic.7 The social dynamics at work in this context are understood as the efforts to improve public health and overall wellbeing of individuals both in the global North and global South, boost agricultural yields and encourage environment-friendly practices to mitigate the adverse effects of climate change. Several factors account for the significant value attached to the life sciences in the context of intense globalisation and continuous change. Surging population numbers and extended life expectancy are augmenting the demand for developing effective and affordable medications, novel approaches for the treatment of chronic diseases and additional cost-effective sources of energy and food production. At the same time, rising global trade and travel, coupled with increased urbanisation, and an uneven distribution of wealth are creating optimal conditions for disease outbreaks, pandemics and environmental degradation.8 Against this backdrop, biotechnology appears full of promise and critical to tackling social and natural concerns; enhancing disease prevention, preparedness and surveillance; promoting development; and alleviating human suffering.9

Economic dynamics include national expenditure on research and development, purchasing power, trends in consumerism, and market pressures and fluctuations. Besides public funding for R&D, which remains a key factor in the growth and flourishing of the bioindustry in developed and emerging economies alike, private investment from venture capital firms, start-up companies and transnational corporations (TNCs) has also played an indispensable role in capturing new markets and further facilitating the extension of the bioeconomy on a global scale. DuPont’s significant footprint in India is indicative in this regard, not least because of the depth and diversity of the activity that the company has undertaken via its offshore R&D centres, ranging from crop science to biofuels.10 Likewise, Merck has outlined a 1.5 billion dollar commitment to expand R&D in China, as part of which it intends to establish an Asia headquarters for innovative drug discovery in Beijing.11

Political dynamics are triggered by states’ increasing commitment to supporting the progress of biotechnology as a way of maximising their power and boosting their status in the international arena.12 In the aftermath of 9/11 and the ‘Anthrax letters’ attack of October 2001, substantial effort has been devoted to harnessing life science research for the purposes of national security. Biodefence and bioterrorism preparedness are thus considered high-priority areas for national investment by government agencies and the military alike. An illustrative example of this approach is funding policy in the USA, where biodefence research is financed by the NIH, the Department of Homeland Security (DHS) and the Defense Advanced Research Projects Agency (DARPA), to name a few.

Under the synergistic influence of these three sets of forces – social, economic and political – biotechnology has been transformed into a truly global fast-evolving enterprise encompassing a multitude of stakeholders, delivering considerable benefits and holding out still greater promise, with profound and far-reaching implications for virtually every aspect of human well-being and social life.

The pharmaceutical industry is a case in point, for its steady expansion would hardly be possible were it not for the vast array of techniques and methods enabled by the progress of the life sciences. Worth roughly 400 billion dollars, the global pharmaceutical market dominates the life sciences industry and arguably determines the trajectory and global spread of life sciences-related technological development.13 Gene cloning, DNA sequencing and recombinant construction of cell lines, to name a few, are all deemed indispensable for the development of novel medicines and therapeutics. It suffices to mention that more than half of the top-selling commercially available drugs in the USA would not exist without those methods.14 Agriculture, too, has been heavily influenced by the on-going biotechnology revolution, as evidenced in the rapid growth and dispersion of commercialised transgenic crops (biotech crops) and the efforts to use GMOs (both animals and plants) for the production of vaccine antigens and other biologically active proteins (‘biopharming’).15 Indeed, the area of farmland planted with transgenic crops rose dramatically from 1.7 million hectares in 1996 to about 60 million hectares in 2002,16 and it is still growing. In addition, technological convergence17 between biotechnology, nanotechnology, information technologies and cognitive science has unlocked a broad scope of opportunities for maximising public (and private) welfare. It offers substantial benefits in wide-ranging areas such as medicine, pharmacy, crime investigation and national security by ensuring precision and reliability, while reducing the amount of time previously required for the performance of certain tasks.
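The scale of that expansion becomes more tangible with some back-of-the-envelope arithmetic. The sketch below (illustrative only; it simply takes the figures cited above at face value) computes the compound annual growth rate implied by the rise from 1.7 million to roughly 60 million hectares between 1996 and 2002:

```python
# Implied compound annual growth rate (CAGR) of transgenic crop area,
# based on the figures cited in the text (illustrative arithmetic only).
area_1996 = 1.7e6   # hectares planted with transgenic crops in 1996
area_2002 = 60e6    # approximate hectares planted in 2002
years = 2002 - 1996

growth_factor = area_2002 / area_1996      # roughly a 35-fold increase
cagr = growth_factor ** (1 / years) - 1    # annualised growth rate

print(f"{growth_factor:.1f}x overall, {cagr:.0%} per year")
# prints: 35.3x overall, 81% per year
```

On these figures the planted area grew roughly 35-fold in six years, an implied annual growth rate of about 80 per cent, which puts the phrase ‘rose dramatically’ into quantitative perspective.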

Four key features of biotechnology make it so appealing to the majority of stakeholders involved. First, biotechnology innovation is characterised by duality, whereby research yields results that simultaneously lead to advances in basic knowledge and stimulate product development.18 Second, the output that the life sciences generate, in the form of new medicines, improved nutrition products, enhanced yields and novel materials, is ‘strongly positive’.19 The increasing utility of tools and strategies for human enhancement, whether in professional sport, for cosmetic and aesthetic purposes, or on the battlefield, vividly reflects the firm conviction that the transformative capacity of biotechnology, even at the most fundamental level, is something to be welcomed and vigorously embraced. Third, biotechnology possesses proven economic viability, as illustrated by the burgeoning industries and new markets it has spurred. Against this backdrop, the high rate of biotechnology expansion is anything but surprising, since

every increment in biological capability pays back the researcher and the researcher’s sponsors in short order. Payback comes in the esteem of peers, in promotions, and in increases in the academic or corporate salaries of the researchers whose work generates knowledge and new therapies. Payback comes in the form of profits for the manufacturers of kits to perform the manipulations, royalties for the writers of the methods manuals, profits for the drug industry. Payback comes for the public in the form of new drugs and therapies.20

Fourth, besides being cost-effective, many of the benefits that biotechnology offers are easy to obtain and disseminate. In other words, many of the various prospects for public (and private) betterment are not situated at some distant moment in the future but can be realised immediately, as a result of which pressing problems can be alleviated, if not fully resolved, and substantial revenue can be generated in the short term. Last but not least, while there are some risks and concerns associated with the advancement of biotechnology, few of those are deemed urgent or significant enough to affect the pace of innovation. Because the actual manifestation of such risks is often contingent upon the interplay of a variety of factors, the likelihood of a major crisis unfolding as a result of the progress of biotechnology is considered low. Moreover, there is a genuine belief that any challenges that may arise from the proliferation of novel technologies can either be foreseen or dealt with on a case-by-case basis. Given the enormous potential of biotechnology for addressing societal, economic and environmental challenges, it is unsurprising that most states have readily endorsed scientific and technological innovation and embarked on large-scale, generously-funded R&D programmes in the life sciences.

Trends in Biotechnology Governance

Given the powerful multifaceted impetus for biotechnology advancement, it is possible to identify at least five key trends in the governance of biotechnology that are common to highly industrialised and developing countries alike. These include: high-level coordination, facilitation and funding; synergies within and between the public and private sectors; emphasis on strategic and competitive interests at the expense of precaution; regulations that seek to promote rather than restrict scientific and technological progress; and overreliance on technical solutions.

High-Level Coordination, Facilitation and Funding

At international level, the on-going expansion of biotechnology has been hailed not only as an inherently positive development but also as an essential prerequisite for enhancing human welfare and addressing various socio-economic, environmental and health concerns. In its 2013 World Health Report, the WHO called for:

Increased international and national investment and support in [life science] research aimed specifically at improving coverage of health services within and between countries.21

The WHO has also strived to promote research on specific diseases, such as HIV/AIDS, cancer, pandemic influenza, tuberculosis and malaria, with the goal to improve methods for prevention and diagnostics and facilitate the development of effective therapeutics and vaccines.22

In a similar fashion, the UN Food and Agriculture Organisation (FAO) has highlighted the positive impact that biotechnology could have on the development of agriculture:

…biotechnology could be a major tool in the fight against hunger and poverty, especially in developing countries. Because it may deliver solutions where conventional breeding approaches have failed, it could greatly assist the development of crop varieties able to thrive in the difficult environments where many of the world’s poor live and farm.23

It is not difficult to see how those assertions have been translated into national policies and practical steps across the globe. The US National Institutes of Health (NIH), which provide the bulk of financial support for medical and health-oriented R&D in the US, spent over 30.9 billion dollars in fiscal year 2012, about a third of which was allocated to funding biotechnology and bioengineering projects.24 Within its Sixth Framework Programme for Research and Technological Development, spanning the period 2002–2006, the European Union (EU) distributed more than 2.5 billion euro for projects under the theme ‘Life Sciences, Genomics and Biotechnology for Health’.25 Developing countries, too, are increasingly investing in ‘red’ biotechnology as part of their efforts to address public health concerns. According to a recent WHO report, support for biotechnology, and particularly for cancer research, in Cuba has soared over the past 20 years, amounting to over one billion dollars.26 As a result, the Cuban biotechnology industry is burgeoning, holding around 1200 international patents and exporting vaccines and pharmaceuticals to more than 50 countries.

The prospect of climate change, coupled with rising population numbers, has compelled governments in the global North and South alike to explore ‘green’ biotechnology as a means of ensuring food security. The USA remains by far the largest commercial producer of GM crops. Several EU member states (France, Germany, Spain, Poland, Romania, the Czech Republic, Portugal and Slovakia), Canada and Australia also feature in the list of industrialised nations that have embarked on growing GM crop varieties. More and more emerging economies are striving to expand their agrobiotechnology sectors, most notably Brazil, India, Argentina, South Africa, Mexico, Burkina Faso, Myanmar and Chile.27 In 2008, the Chinese government launched a major R&D initiative, worth 4 billion dollars, to develop by 2020 new plant varieties that will enhance yields, have improved nutritional value and be resistant to pests.28

Synergies Within and Between Private and Public Sectors

Public-private partnerships, underpinned by access to early-stage risk capital and strong linkages between business, universities and entrepreneurial support networks, constitute an important vehicle for promoting innovation and fostering technology transfer and product development. For instance, the Chinese government has launched a major initiative mobilising 2.5 billion dollars in venture capital to support start-ups in the immense Zhangjiang science park outside Shanghai29; Russia’s Rusnano has entered a 760 million dollar partnership with the US venture capital firm Domain Associates to fund ‘emerging life science technology companies and establish manufacturing facilities in Russia for production of advanced therapeutic products’; and Cleveland’s University Hospital has allocated 250 million dollars to setting up a ‘non-profit entity to fund and advise physician-scientists on transitional research and a related for-profit accelerator that will develop selected compounds to proof of concept.’30 The Kauffman Foundation in the USA, a wealthy philanthropic establishment dedicated exclusively to the goal of entrepreneurship, has been particularly zealous in its quest to promote university-based entrepreneurial activities nationwide. Its Kauffman Campuses Initiative, launched in early 2003, enjoyed so much popularity among universities that, following the initial round of grants totalling 25 million dollars, the Foundation announced its resolve to leverage a 100 million dollar investment for the creation of new interdisciplinary education programmes.31

University-industry partnerships, while not a novel phenomenon in the area of biotechnology, have considerably intensified over the past several decades, thus facilitating the widespread commercialisation of life science research. Indeed, 90 per cent of the companies in the US surveyed by Blumenthal et al. in 1996 had relationships with an academic institution in that year and in more than half of those cases industry provided financial support for research in such institutions.32 According to another study, the total industry investment in academic life science research in the USA tripled between 1985 and 1998 reaching almost 2 billion dollars and has been growing ever since.33 Against this backdrop, some commentators have put forward the ‘Triple Helix’ model, which serves both as a conceptual tool and a policy blueprint. In the former case, it is used to elucidate the academic-industry-government relationships that underpin the institutional arrangements and changing practices in the processes of production, transfer and application of knowledge in post-industrial societies; in the latter, it is promoted as a framework for economic development through state investment and knowledge sharing between academia and industry.34

Others, however, have remained sceptical of the close integration of universities and the private sector voicing concerns about the possible deleterious effects arising therefrom:

As in other activities, when big money flows fast, temptations and opportunities arise for risky behaviour and stealthy or even brazen wrongdoing in pursuit of personal or institutional advantage. The new world of academic-commercial dealings is characterised by some grey areas and evolving rules for permissible and impermissible conduct. The people who manage and conduct research in scientific organisations are not immune to the weaknesses and foibles so plentiful elsewhere, despite the accolades for probity that science bestows upon itself.35

With more and more universities joining the biotechnology ‘gold rush’ and corporate values and goals steadily penetrating the professional academic cultures, scholarship turns into a result-oriented activity subject to the priorities and interests of business partners and industrial sponsors. Strategy and careful planning deemed essential to the pursuit of for-profit knowledge can have a restraining effect on the spontaneous vigour characteristic of academic research, limiting the range of problems that could be studied to those defined by the market.36 At the same time, scientists often find themselves under tremendous pressure striving to satisfy the demands of their industrial clients without utterly neglecting their academic duties ranging from mentorship through filing grant applications to publishing. The extensive workload coupled with the bright prospects for securing long-term research funding and achieving some individual gain and prominence provide a favourable environment in which instances of dubious, sometimes fraudulent, behaviour, conflicts of interest and lack of transparency, unless too severe, are unlikely to encounter widespread opprobrium and may even go unnoticed.37 In the race for patents and venture capital, the business mentality dulls scientific rigour and the ethics threshold appears not too difficult to cross.

Governments Tend to Favour Strategic, Political and Economic Interests at the Expense of Precaution

Given the tremendous benefits that biotechnology is expected to generate in virtually any sphere of human activity, it is not difficult to understand why its progress is predominantly viewed through an explicitly positive lens by policy-makers. Since the opportunities for achieving public betterment and enhancing state prestige and international standing are too tempting and too abundant, there is a powerful urge to dedicate both will and resources to promoting the large-scale expansion of the life sciences. For one thing, the prospect of conquering disease and maximising human wellbeing provides solid justification for deliberate and sustained investment in fostering scientific and technological prowess. Lack of commitment and reluctance to support R&D in the life sciences thus become unfavourable options in the political calculations of states, regardless of their level of economic development and international status. Within a political calculus pervaded by realist fears, competition and power, the perceived risks of inaction with regard to scientific and technological development justify vast expenditure and lower regulatory barriers to innovation and product development. Political choices concerning biotechnology support are therefore frequently made at the expense of calls for caution and of potential social, environmental and ethical concerns.

The regulation of genetic engineering is a case in point. As discussed in the previous chapter, from the outset the attempts of governments to impose strict controls on research involving rDNA faced a severe backlash from academic scientists and business executives alike. By the 1980s, the various legislative initiatives put forward in the USA had been abandoned in favour of the regime established by the NIH Guidelines, which virtually exempted the biotechnology industry from formal regulation. While the leading US-based companies pledged to ‘voluntarily comply’ with the Guidelines, behind the scenes they craftily continued to push for a system that would insulate them from governmental and public scrutiny.38 Indeed, during the 1990s, when the States Parties to the Biological and Toxin Weapons Convention (BTWC) strove to strengthen the treaty by negotiating a binding verification mechanism, corporate interests proved too big and too important to be ignored. Both the Pharmaceutical Research and Manufacturers of America (PhRMA), which represented the country’s major research-based pharmaceutical and biotechnology companies, and the Biotechnology Industry Organisation (BIO), which at that time represented some 1400 biotechnology firms, became vocal opponents of any international arms control measures that seemed in any way to threaten the protection of proprietary information and intellectual property.39 In the period between 1994 and 2001 the associations invested considerable effort, time and ingenuity in lobbying the US government and influencing the diplomatic talks in Geneva to secure an outcome in line with the demands of their constituencies.
Of course, it would be naive to ascribe the US resolve to reject in 2001 both the text of the Protocol and its utility in general for providing adequate verification and enhancing confidence among States Parties solely to the activity of the biotechnology industry; nevertheless, it would be equally naive to suppose that corporate interests played no significant role in the process.40

Besides economic priorities, national security and military calculations can also provide a compelling rationale for downplaying the potential risks associated with biotechnology expansion. Following the ‘Anthrax letters’ attack in October 2001, the US government embarked on a massive financial investment to boost its bioterrorism preparedness and to enable the prevention, early detection and monitoring of, and emergency response to, biological threats. As outlined in Biodefense for the 21st Century, a presidential directive that set out a comprehensive framework for national biodefence policy, between 2001 and 2005 the federal government provided roughly 6 billion dollars ‘to state and local health systems to bolster their ability to respond to bioterrorism and major public health crises’.41 Along with the highly controversial vaccination programme that the government envisaged,42 another important development designed to enhance America’s biodefence preparedness and capability was the drastic increase both in the number of high-containment labs (BSL-3 and BSL-4) and in the number of researchers with access to some of the most dangerous pathogens known to mankind, including the causative agents of Ebola, plague and Q fever.
Some commentators have questioned the logic behind this policy, highlighting the heightened risk of accidental or deliberate release of pathogens.43 Far from being ill-founded or hypothetical, such fears stemmed from a range of high-profile cases that occurred after 2001 across the US, in which a lack of proper training and professional negligence resulted in scientists being exposed to or infected with deadly microbes.44 Real-life horror stories about vials of plague being transported in the hand-luggage of researchers on passenger aircraft without the required authorisation, and about deadly cultures gone missing from what appeared to be secure laboratories, further fuelled criticism of the US biodefence policy, raising difficult questions about its appropriateness and actual goals even before the ‘Anthrax letters’ investigation revealed that the attack was an ‘insider’s business’.45

Biotechnology Regulations Seek to Promote Rather Than Restrict Technological Advancement

Life science research, just as any other sphere of professional activity, is subject to a range of institutional, national and international regulations. Along with more general rules, such as those related to occupational health and safety, fair pay and job competition, conflicts of interest, labour rights, and professional liability, there are also specific ones addressing particular aspects of the research process, including project clearance (e.g. review by local biosafety committees), safe laboratory practice and transport of pathogens (e.g. the 2005 International Health Regulations), exchange of viral strains (e.g. the Pandemic Influenza Preparedness Framework, 2011), handling of dangerous pathogens (e.g. the US Select Agent Programme) and the ethical treatment of human subjects and samples obtained therefrom (e.g. the 2004 Human Tissue Act in the UK).46 While hardly exhaustive, this list suffices to convey the idea that the regulatory regime governing the practice of life science research is dense and comprehensive. With more than 30 international organisations overseeing biotechnology from various perspectives,47 there is a prima facie reason to assume that the regime in its current form is sufficiently flexible to accommodate novel advances and to hold at bay any potential risks they may pose. Yet in reality the opposite trend has prevailed over the past decade: the existing governance mechanisms have struggled to respond adequately to the proliferation of new scientific developments with multiple adaptive uses, many of which pose profound ethical quandaries. How can this discrepancy be accounted for?

Part of the problem stems from the fact that since at least the late 1970s the regulation of biotechnology has been streamlined so as to become compatible with and not a restriction on continued technological change and economic growth.48 As such, it rests upon the barely questioned assumption that the progress of biotechnology is inherently good and needs to be harnessed and vigorously promoted. Needless to say, any measures that seem to slow down or restrain its advancement are deemed undesirable and even detrimental to socio-economic development. Hence, when developing regulations, policy-makers have generally pursued a two-fold objective: first, to promote the safe practice of life science research by reducing any risks arising therefrom both to scientists and the general public; and second, to ensure that any issues that may hinder the expansion of biotechnology are not subject to restrictive legislation.

A vivid manifestation of this approach is the way in which the ongoing debate on ‘dual use research of concern’ – benignly-intended research that seeks to maximise human welfare by responding to health, societal and environmental ills but could also facilitate the development of more sophisticated and potent biological weapons and enable bioterrorism49 – has been handled. For more than a decade, researchers, journal editors, security experts and policy-makers have strived to devise oversight mechanisms and governance initiatives that could adequately tackle the challenge of dual use without stifling innovation. Unfortunately, to date their efforts have met with little success, as a result of which virtually every experiment of dual use concern is dealt with separately on a case-by-case basis. This is not to say that there are no similarities across studies of this kind. On the contrary, several of the most notable examples follow a similar paradigm, including the creation of a vaccine-resistant strain of the mousepox virus, the artificial synthesis of the polio virus, the recreation of the 1918 Spanish influenza virus and, most recently, the production of a mammalian-transmissible H5N1 avian influenza virus (see Fig. 4.1).50 All four were performed in strict compliance with the rules and procedures in place for laboratory biosafety, biosecurity and biorisk management, and under appropriate physical containment conditions; all had passed thorough review by the respective local biosafety and bioethics committees; and all were deemed essential in terms of public health benefits. Above all, the ethical and security concerns that the studies have raised go far beyond the laboratory door, posing fundamental questions about how life science research is reviewed, conducted and communicated.
Fig. 4.1 Examples of Experiments of Concern

Yet none of the high-profile experiments of concern has proved critical enough to provoke a radical change in the way dual-use research is governed.51 Three points merit consideration in this regard. The first pertains to the manner in which the dominant discourse on dual use is framed, that is, in purely ethical terms as a dilemma. While bioethics undoubtedly has a role to play in discussions of dual use, the language of ‘dual-use dilemmas’ is too abstract to offer appropriate analytical tools for dealing with the issues at play. As discussed above, the questions that dual-use research poses, such as data sharing, research funding and project planning, are far from hypothetical; they feature explicitly in everyday professional practice. However, the ‘dilemma framework’ automatically abstracts them from the complex socio-technical arenas in which they actually present themselves, by laying emphasis on what action should ideally be taken rather than on what is practically feasible given the circumstances.52 Moreover, such issues are typically structural in nature, for they constitute fundamental elements of the life sciences professional culture, and as such could hardly be adequately addressed solely at the level of individual researchers. Yet framing social, legal and security concerns in terms of moral dilemmas allows structural issues to be omitted from the discussion, rendering life scientists the chief, if not the only, moral agents expected to reach what is deemed to be the ‘right’ answer.53 Assigning abstract duties then comes to be regarded as an appropriate ‘solution’, even if those duties are virtually impossible to fulfil given the complexities of the working environment within which researchers operate.

The second point is related to the reductionist view that dominates the discourse on what counts as risk in life science research. Perhaps one of the most significant legacies of the Asilomar Conference on rDNA (see Chapter 3) is the emphasis on laboratory risk that can be effectively managed by dint of physical containment and rules and procedures for safe laboratory practice.54 It suffices to mention that the bulk of guidelines and formal regulations published by the WHO focus exclusively on promoting and refining measures that aim to maximise laboratory biosafety and prevent the accidental release of pathogens. Hence, it is hardly surprising that the concept of dual use, and the idea of risks beyond the laboratory door implicit in it, seems alien to the majority of practising researchers. Striking as it may appear, even though dual-use research has been debated for more than a decade now, the level of awareness among life scientists of the broader social, security and legal implications of their work remains low.55

The third point deals with the way in which risks in life science research are assessed and mitigated. Given the narrow definition of risk encompassing technical particulars, physical containment and biosafety, risk assessment is considered an appropriate and reliable tool for ensuring research safety. The heavy reliance upon risk-assessment tools rests on two underlying assumptions. One is that it is possible to foresee and calculate most, if not all, things that could potentially go wrong, both during the development phase of a project and after its completion. The other is that it is then possible to use the resulting data as a basis for devising measures and strategies for eradicating, or at least mitigating, the risks likely to occur. Attractive as it may seem, this ‘new alchemy where body counting replaces social and cultural values’ presupposes a clear distinction between the risk assessment ‘experts’ and the general public, whereby the former are granted a licence to make decisions about the risks that the latter cannot do without.56 Likewise, the cost-benefit analysis on the basis of which research proposals are screened for potential risks and security concerns has attracted serious criticism. In the view of some commentators, besides being sometimes deeply inaccurate, cost-benefit analysis is ‘ethically wrong’ since ‘applying narrow quantitative criteria to human health and human life’ is unacceptable.57 But there are other problems, too. As Dickson points out, cost-benefit analysis distorts political decision-making by omitting any factors that cannot be quantified, thus obscuring questions of equity, justice, power, and social welfare behind a technocratic haze of numbers.58 As a result, complex and politically charged decisions are reduced to a form that fits neatly into the technocratic ways of making regulatory decisions, whereby calculations and approximations made by the few substitute for the judgements of many.59

The wide-ranging controversy that unfolded in late 2011, when two teams of scientists working independently in the Netherlands and the USA managed to produce an airborne strain of the H5N1 avian influenza virus, a highly pathogenic and lethal microbe with an over 60 per cent mortality rate in humans, arguably constituted the pinnacle of the deliberation on dual-use research. Both studies set alarm bells ringing for the security community, who almost immediately jumped into the debate voicing concerns over the possibility of biological proliferation and bioterrorism. Some commentators even argued that the experiments ran counter to the spirit, if not the letter, of the 1975 BTWC.60 Against this backdrop, the resultant controversy was deemed, at least initially, to offer a timely opportunity to evaluate the existing governance mechanisms, determine their gaps and weaknesses, and broaden the scope of deliberation by inviting the participation of a wide range of stakeholders. Unfortunately, the outcome of the debate proved far more moderate, signalling a preference for preserving the status quo without disrupting the established systems for governance and oversight. Despite the extensive mass media coverage of the controversy, only a few public consultations were held, and none of those was designed as a platform for making policy proposals or developing action plans. Moreover, the densely packed agenda prepared duly in advance left very limited scope for posing ‘tricky’ questions which the participating ‘experts’ might have struggled to answer. Needless to say, all consequential decisions were made behind closed doors, away from public scrutiny, and on some occasions the people with the greatest vested interest in the publication of the studies were also the ones with the greatest say in the process.61 There were no significant changes in terms of governance initiatives, either.
Far from being ground-breaking developments, the US Government Policy for Oversight of Life Sciences Dual Use Research of Concern62 and the decision of the Dutch government to invoke export control legislation before allowing the publication of the study conducted within its jurisdiction were little more than desperate moves that aimed to obscure the inadequacy and shortcomings of the measures already in place.63 Overall, the manner in which the H5N1 debate was handled could be treated as a missed opportunity, whereby those in charge of the decision-making process did little to address or even acknowledge the broader issues underpinning dual-use research of concern but simply ‘kicked the can down the road to the next manuscript’ waiting for the next controversy to erupt.64

Reliance on Technical Fixes

Technology seems to play a significant role in the governance of life science research. High-containment laboratories, well-equipped biosafety cabinets, sophisticated waste management systems, enhanced personal protective equipment and secure containers for the safe storage and transportation of biohazard materials are just a few of the tools and systems in place that allow the safe handling of dangerous pathogens and toxins and, at the same time, protect both laboratory personnel and the general public from exposure to deadly microbes. That said, the effectiveness of technical solutions should not be overstated, if only because ‘problems’ of governance are rarely technical matters per se but rather constitute complex issues of human relatedness. Nevertheless, the attractiveness of technological fixes as offering reliable risk mitigation and reassurance about the safety of biotechnology is ever growing. It suffices to mention that the H5N1 controversy discussed above was in part resolved after the lead researchers in the Netherlands and the USA respectively agreed to add a detailed section on the technical specificities and the laboratory biosafety and biosecurity measures taken during the experiments.65 The strategy proved effective in diverting attention from the rather inconvenient questions regarding the utility, and significant potential for hostile misuse, of so-called ‘gain-of-function’ (GOF) research and concentrating it on more mundane issues of in-house precautions and safety procedures. Once the latter were deemed adequately resolved, the former were effectively forgotten.

Still, the value of technical means in ensuring reliable risk management should not be taken for granted. For one thing, laboratory biosafety precautions, however sophisticated, are far from perfect and accidents do occur. Such is the case with the Pirbright site in the UK which was at the centre of a major outbreak of foot-and-mouth disease in 2007, as a result of which over 2100 animals were slaughtered.66 In 2012 the bioterrorism BSL-3 laboratory at the US CDC in Atlanta suffered repeated problems with airflow systems designed to help prevent the release of infectious agents.67 The faulty system could perhaps be regarded as an exception had it not been for the authoritative investigation report of the US Government Accountability Office (GAO) released in March 2013. According to the report, the cost of building and maintaining high-containment laboratories, coupled with the absence of national standards for their design, construction, operation, and maintenance ‘exposes the nation to risk’.68 Far more critical is the situation in the developing world and emerging economies where lax regulations and technical failures have significantly heightened the risk of accidental release of pathogens, as demonstrated by the numerous ‘escapes’ of the Severe Acute Respiratory Syndrome (SARS).69

But even if technology functions impeccably, this hardly reduces the likelihood of human error or inappropriate behaviour. Unlocked doors in high-containment facilities hosting deadly pathogens, eating and drinking in laboratories and poor waste disposal practices are just a small part of the otherwise long list of mundane mishaps that may result in severe consequences. It is worth mentioning that the US CDC came under the spotlight after internal e-mail correspondence revealed that doors in the BSL-3 block where experiments involving the causative agents of anthrax, SARS and influenza were performed had been left unlocked on numerous occasions, thus increasing the risk of unauthorised access or theft.70 Given the chance of technical flaws and the potential for human error, some life scientists have begun to question the reliability of existing laboratory precautions and to demand thorough review and evaluation. In a recent letter to the European Commission, the Foundation for Vaccine Research asked for ‘a rigorous, comprehensive risk-benefit assessment’ of GOF research that ‘could help determine whether the unique risks posed by these sorts of experiments are balanced by unique public health benefits which could not be achieved by alternative, safe scientific approaches’.71

Engines that Drive Biotechnology Momentum

By and large, the ongoing progress of biotechnology is viewed and assessed through an explicitly positive lens, which allows a focus almost exclusively on the benefits likely to be accrued, notwithstanding the risks, actual and potential. The resultant distorted image is problematic, not least because it precludes any comprehensive discussion of the potential side effects and negative implications of novel life science advances. Above all, it sustains the barely questioned assumption that the existing governance mechanisms are adequate and sufficient to cope with the stresses and strains of the rapidly evolving biotechnology landscape. Yet given the complex and multifaceted dynamics shaping the life science enterprise, the rapid pace of innovation, and the limits to predicting the synergistic and cumulative effects of the proliferation of new technologies, the uncritical acceptance of such assumptions is at best naïve and at worst dangerous.

Integration of Biology with Other Disciplines

Arguably the advancement of the life sciences has greatly benefited from the fascinating breakthroughs made in other areas of study, such as chemistry, engineering, computing, informatics, robotics, mathematics and physics. Some commentators even talk about a Third Revolution in Biotechnology underpinned by scientific and technological convergence:

Convergence does not simply involve a transfer of tools sets from one science to another; fundamentally different conceptual approaches from physical science and engineering are imported into biological research, while life science’s understanding of complex evolutionary systems is reciprocally influencing physical science and engineering. Convergence is the result of true intellectual cross-pollination.72

The resultant ‘New Biology’ has opened up a range of marvellous possibilities, enabling the manipulation of living matter at the full range of scales, as well as the application of biological systems principles to the development of novel materials, processes and devices.73 As such, it has been largely hailed as possessing the ‘capacity to tackle a broad range of scientific and societal problems.’74 This is not an exaggeration. As noted in a recent report of the US NAS, the precipitous decline in the cost of genome sequencing would not have been possible without a combination of engineering of equipment, robotics for automation, and chemistry and biochemistry to make the sequencing accurate.75 Likewise, it is the combination of expertise from fields as diverse as evolutionary biology, computer science, mathematics, and statistics that has allowed both the analysis of raw genomic data and the subsequent application of these data in other fields.76 At the same time, advances in nanoscience and nanotechnology have considerably enhanced drug delivery, making it more accurate by targeting specific parts of the body.77

Yet the transformative potential of scientific and technological convergence comes at a price, not least because, parallel to the benefits it offers, there are risks whose effects could be truly devastating.78 Take drug delivery, for instance. Thanks to the technological breakthroughs of the past decade, doctors have gained unprecedented access to the human body which, in turn, has facilitated the treatment of previously incurable diseases and conditions (e.g. some forms of cancer). Nanoparticles and aerosols are now utilised to deliver a precise dose of therapeutics to tissues and cells via novel pathways, circumventing the body’s natural defences and evading the immune response. It is not difficult to imagine how such knowledge could be misapplied for malicious ends, including incapacitating and killing. Research on bioregulators is a case in point. Bioregulators are natural chemicals in the human body that play a vital role in the maintenance of homeostasis but which, when administered in large quantities or to healthy individuals, can be toxic and lead to serious disorders, even death. Given their properties, bioregulators constitute the perfect bioweapon: efficient and virtually impossible to detect.
And if in the past security analysts discounted the risk of their weaponisation due to the instability of the compounds when released into the atmosphere, the emergence of novel drug delivery techniques has significantly altered the security calculus.79 This is but one example of the challenges that the increasing convergence between biology and chemistry poses to the integrity of the international biological and chemical non-proliferation regimes.80 Even though some effort has been made in recent years to address these and other areas of concern and strengthen the international prohibition against biological and chemical warfare, in practical terms little has been achieved, as a result of which the risk of the hostile exploitation of novel scientific developments remains far from hypothetical.

Along with the risk of misuse of new knowledge, there is the risk posed by the lack of sufficient scientific knowledge. Cross-disciplinary convergence opens a multitude of opportunities for the manipulation and modification of living matter but, at the same time, it precludes almost any sensible assessment of the potential interactions likely to occur in the process. Nano-based medicine is but one area that has attracted criticism in this regard. Since some elements behave differently at the nano-scale, it becomes extremely difficult to assess their level of toxicity or the other negative side effects that they may exert. Such is the case with long carbon nanotubes, which, having initially been praised for their potential to improve implant development,81 were later blamed for exhibiting asbestos-like behaviour that could lead to cancer.82

Another area of converging science with far-reaching implications is synthetic biology, a cross-disciplinary field that draws upon strategies and techniques from molecular biology, chemistry, engineering, genomics, and nanotechnology and thus enables the design and modification of biological systems at a fundamental level. Empowered by the tools of synthetic biology, in 2002 scientists managed to assemble a polio virus ‘from scratch’ in the absence of a natural template. And in 2010 Craig Venter and his team announced the construction of the first self-replicating synthetic cell which, in their view, was ‘a proof of the principle that genomes can be designed in the computer, chemically made in the laboratory and transplanted into a recipient cell to produce a new self-replicating cell controlled only by the synthetic genome.’83 The controversial work has attracted criticism on several grounds, including the potential negative effects of the accidental or deliberate release of the novel organism in the environment and the arrogance of scientists to ‘play God’.84 More broadly, both the polio and synthetic cell studies have exposed the obstacles to the regulation of synthetic biology.85 While some commentators dismiss the risk of bioterrorism, underscoring the key role of tacit skills and knowledge and the difficulties that the lack thereof poses to the replication of the experiments,86 other issues still merit attention. Consider the question of access to commercially available genomic sequences. Even though the oversight system for screening base pair orders has improved since the 2006 Guardian report that exposed the lax regulations under which virtually anyone could order gene sequences,87 gaps still remain leaving scope for abuse by those with malign intent. 
For example, Schmidt and Giersch have outlined at least three areas of emerging challenges that the existing governance regimes would struggle to accommodate, including ‘split orders’, ‘outsourcing’, and the potential for non-natural biological systems.88

Biology as a Predictive rather than Descriptive Science

The Human Genome Project, completed in 2003, lasted over ten years and cost close to 3 billion dollars; by contrast, about a decade later, whole-genome sequencing can be performed within hours at a price of roughly 1000 dollars or less.89 While still in their infancy, personalised medicine and individual genetic testing are steadily gaining popularity. Indeed, ‘up to 100 000 people in England are expected to have their entire genetic makeup mapped in the first stage of an ambitious public health programme’ launched by the National Health Service in 2012 that aims to ‘revolutionise the treatment and prevention of cancer and other disease.’90 According to its proponents, genomic testing offers numerous advantages vis-à-vis traditional evidence-based medicine, including the possibility of early diagnosis of disease, of individually tailored treatment and, perhaps most importantly, of disease prevention, as illustrated by the resolve of the Hollywood actress Angelina Jolie to undergo a double mastectomy after discovering that she carries an inherited genetic mutation that puts her at high risk of breast and ovarian cancer.91 But this is just the beginning. In 2012 scientists managed to sequence a foetus’s entire genome using a blood sample from the mother and a saliva specimen from the father, a development that could potentially allow a range of genetic disease conditions to be detected prenatally.92 And laboratory experiments have already demonstrated the efficacy of genetic therapy in curing mitochondrial disease by creating an embryo with genetic material from both parents and a third person acting as a donor.93

While truly breathtaking, the advances outlined above raise a host of thorny ethical, social, and legal concerns that merit public scrutiny and extensive deliberation before decisions regarding their widespread application are made. At a very basic level, there is the question of whether, and to what extent, we as individuals are capable of assimilating the information that our own genetic makeup may reveal. Are we sufficiently resilient to cope with the emotional distress, anxiety, shame, stigma and guilt that may accompany the awareness of severe medical conditions that we or our loved ones suffer from or are likely to develop? Far from hypothetical, this question has prompted the establishment of a novel profession, that of the genetic counsellor, whose task is to help patients overcome any negative effects, stress, or psychological trauma that the disclosure of their genomic map may create.94 This is just a partial solution though, for the crux of the matter lies in finding a way to deal effectively with risk and probabilities, and we as humans are yet to demonstrate a capacity for understanding them or relating them to our own lives.95

Individual emotional turmoil, however significant, constitutes only the tip of the iceberg. According to Daniel Kevles, the torrent of new genetic information has already begun to fundamentally reconfigure social practices and inter-personal relations:

It has been rightly emphasised that employers and medical or life insurers may seek to learn the genetic profiles of, respectively, prospective employees or clients. Employers might wish to identify workers likely to contract disorders that allegedly affect job performance while both employers and insurers might wish to identify people likely to fall victim to diseases that result in costly medical or disability payouts. Whatever the purpose, such genetic identification would brand people with what an American union official has called a life-long ‘genetic scarlet letter’ or what some Europeans term a ‘genetic passport’.96

Linking genetic makeup with human identity would ultimately set the scene for the proliferation of technologies aimed at human enhancement: after all, if a gene therapy could allow one to stand a chance in a job competition, boosting one’s capabilities would potentially make one a more desirable candidate. Other issues of more immediate concern are also likely to arise. One is privacy. Gene-sequencing companies usually hold the genetic data of their clients in digital format on online platforms, which automatically creates a risk that personal information may be leaked, hacked or stolen.97 Further, there is the question of ownership. Consider, for instance, the controversial issue of human gene patenting, whereby patented genes are treated as research tools and, as such, are controlled by the patent holder, who may restrict and charge for their use.98 Thus created, the system often operates to the detriment of patients by hindering research practice, raising diagnostic prices and denying access to second and independent medical opinions.99 Gene identification alone has a potential ‘dark side’ too, for it could enable the development of weapons targeted at group-specific gene markers (e.g. ethnicity).100

Pre-natal genetic testing is yet another significant bone of contention, not least because it evokes notions of state-mandated eugenic programmes and assaults on human rights and dignity. While a Nazi-like campaign for a superior race seems improbable in the twenty-first century, this is not to say that other forms of eugenics may not be encouraged. Indeed, some commentators have highlighted the rise of ‘homemade eugenics’,101 whereby individual families can make decisions about the attributes of their progeny:

The lure of biologically improving the human race, having tantalised brilliant scientists in the past, could equally seduce them in the future, even though the expression of the imperatives may differ in language and sophistication. Objective, socially unprejudiced knowledge is not ipso facto inconsistent with eugenic goals of some type. Such knowledge may, indeed, assist in seeking them, especially in the consumer-oriented, commercially driven enterprise of contemporary biomedicine.102

It is plausible to assume that when presented with the opportunity of having their future child tested for genetic disorders, many parents would barely hesitate to accept. Such a resolve could have far-reaching implications, though. For instance, some genetic therapies entail the use of donor DNA different from that of the parents, whereby any genetic modifications in the embryo will pass down to future generations.103 Despite the government's support for ‘three-parent babies’ in the UK, local religious organisations have protested vociferously against the legalisation of the technique.104 At the same time, there are certain genetic disorders that can be diagnosed at an early stage but, as yet, cannot be cured, which inevitably poses the tough choice between raising an unhealthy child and abortion. To be sure, such questions constitute more than individual parents’ dilemmas, for they touch upon established social and cultural values, something evident in the profound differences across national reproductive policies. More broadly, there are concerns that reproductive genomics may remain the prerogative of those affluent enough to afford it, thus further exacerbating the divide between the global rich and the global poor.105

Diffusion of Life Science Expertise: International Collaboration, De-Skilling and Amateur Biology

The growth of life science capacity over the past few decades across the globe has been truly astonishing, leading to the emergence of a vibrant research community that brings together researchers from various parts of the world. Indeed, a 2011 NAS report highlights the extension of both North-South and South-South partnerships, which has played a key role in synergising strengths and maximising competitiveness by improving the quality and effectiveness of research and facilitating data sharing.106 At the same time, increasing collaboration in the realm of biotechnology industry has offered companies situated in emerging economies access to the global market, thus contributing to economic development and growth.107

Recent advances in technology and in laboratory and experimental equipment have further affected the practice of life science research in profound ways. Improvements in DNA sequencing technology have significantly shortened the time required for the preparation of nucleotide sequences, thus relieving scientists of the burden of completing the task themselves and allowing them to focus on their actual project instead. Studies and experiments once performed by senior researchers with extensive experience are now carried out by Masters students. Aided by specially designed genetic engineering toolkits, children as young as ten can start exploring the realm of biology in an interactive and engaging manner. Needless to say, their notion of science, and of the world in general, will differ significantly from that of their parents, whose primary sources of knowledge used to be textbooks and encyclopaedias. Indeed, the increasing commercialisation of synthetic biology offers anyone curious enough to fiddle with biological systems the chance to do so in the comfort of their own home.108 Such modern gene hackers often lack a formal background in biology and come from various walks of life. Driven by an insatiable appetite for knowledge and the vision of a ground-breaking discovery that could be turned into a multi-million dollar profit, they take up the rather unusual hobby of biohacking, which entails the redesign of existing, and the creation of novel, biological systems. For just a few hundred dollars, bio-enthusiasts can set up laboratories, easily obtaining all essential requisites and equipment through online sales. And if to some biohacking equates to little more than an unusual hobby, others highlight its potential to generate substantial revenue and fuel economic development.109

Contrary to popular expectation, biohackers are not just eccentric individuals who work in solitude away from public attention. Rather, they are members of a wide global movement dedicated to the ideal of Do-It-Yourself Biology (DIY), which has branches in 45 locations on four continents.110 The movement has been partially institutionalised through the establishment of the BioBricks111 and International Genetically Engineered Machine (iGEM) Foundations, which seek to promote the open and ethical conduct of biological engineering and to stimulate innovation and creativity. To this end, iGEM holds an annual competition open to high school students, university undergraduates and entrepreneurs from all over the world. With more than 200 participating teams, the competition constitutes the premier forum at which biohackers can showcase their skills through project presentation.

Exciting as it may seem, the ongoing diffusion of life science expertise poses an array of governance conundrums. At the level of professional practice, the proliferation of research facilities around the world has exposed the urgent need for laboratory biosafety and biosecurity training, especially in developing states where a tradition of handling dangerous pathogens is lacking. The issue is further complicated because such countries often lack the legal and institutional infrastructure required to ensure that professional practice complies with relevant international regulations. Foreign aid has gone some way towards helping overcome those deficiencies, but it has given rise to new problems, too. For instance, it is far from unlikely for a donor state to provide material support for the construction of a state-of-the-art laboratory while eventually leaving its maintenance to the local government, which can hardly afford the subsequent costs. A similar trend is observed in the area of capacity building and human resource development. Most projects that aim to promote biorisk management and a biological security culture tend to be severely constrained in terms of time and funding and overly ambitious in terms of agenda and expected outcomes. The lack of adequate mechanisms for quality assessment hinders progress evaluation and sometimes leads to duplication of effort and resources.

The emergence of DIY biologists in the life science arena has further added to the challenge of ensuring that novel scientific and technological developments are utilised in a safe and ethical manner. Even at the level of everyday practice, difficulties persist. For instance, many amateur scientists have complained of the lack of manuals and guidelines regarding the safe operation and maintenance of home laboratories. Issues such as waste disposal, the safe handling and storage of biological material and the prevention of contamination pervade the work of biohackers who, unlike professional researchers, conduct experiments in a much more volatile environment.112 Potential security concerns are also present. With more and more individuals gaining access to biological engineering technologies, ensuring appropriate oversight of what goes on in garage laboratories becomes increasingly difficult. The experience of the US FBI is a case in point. Back in 2004 the FBI arrested Steven Kurtz, a professor at the University of Buffalo, on suspicion of plotting a bioterrorist attack.113 The subsequent investigation revealed that all the laboratory and DNA extraction equipment found in Kurtz’s house had been legitimately obtained and was used in his artwork. In an attempt to avoid mistakes of this kind, the FBI has drastically changed its approach to dealing with the DIY movement, launching a series of outreach activities that seek to raise awareness of the potential security implications of biohacking.114 While undoubtedly necessary, such initiatives may well be seen as too little, too late in light of the wide spread of materials, tools and devices that could facilitate the malign misuse of the life sciences. Indeed, it is worth noting that as early as the late 1990s the US Defense Threat Reduction Agency (DTRA) managed to build a research facility that simulated the manufacture of weaponised anthrax using only commercially available materials and equipment.115

The Role of States: Both a Poacher and Gamekeeper

Structural factors have an important bearing on the development and growth of biotechnology. Economic considerations, power interests and realist fears generate potent dynamics that shape, influence and sometimes direct the life science trajectory. Within this context, states assume a dual role. On the one hand, they are expected to act as gamekeepers and to regulate, monitor and control the process of life science research and the dissemination of novel technologies. On the other hand, though, they also have powerful incentives to act as ‘poachers’, not least because of the fascinating opportunities for enhancing their prosperity, prestige and security that scientific and technological developments open up.116 The following passage effectively outlines states’ dual function:

Government has an important role in setting long-term priorities and in making sure a national environment exists in which beneficial innovations will be developed. There must be a free and rational debate about the ethical and social aspects of potential uses of technology, and government must provide an arena for these debates that is most conducive to results that benefit humans. At the same time, government must ensure economic conditions that facilitate the rapid invention and deployment of beneficial technologies, thereby encouraging entrepreneurs and venture capitalists to promote innovation.117

Given that the agent (i.e. state governments) in charge of initiating ethical debates on the progress of biotechnology is also the one expected to provide the conditions that allow this progress to generate outcomes conducive to economic growth and political superiority, it is hardly surprising that issues likely to slow down or otherwise hinder the enormous momentum of the life sciences are omitted from public discussion. This duality further informs how risks are perceived, framed and addressed. For instance, even though most developing countries lack the capacity to manage dual-use research of concern, they do not see this as an immediate priority and prefer to invest effort and resources in improving their laboratory biosafety and biosecurity infrastructure and capacity.118 In the view of their governments, the dangers of naturally occurring and circulating diseases constitute a far greater worry than the potential for misuse of cutting-edge research. By contrast, some developed countries, most notably the USA, have embarked on building up their biological defence systems, highlighting the grave threat posed by the potential use of bioweapons by non-state actors. These activities have encountered severe opprobrium, as some analysts see them as a contravention of the norms embedded in the BTWC.119

The evolution of the chemical and biological non-proliferation regime epitomises the attempts of states to avert the hostile exploitation of the life sciences whilst promoting their use for ‘peaceful, prophylactic and protective purposes’. The entry into force of the BTWC and the Chemical Weapons Convention (CWC) in 1975 and 1997, respectively, is indicative both of states’ renunciation of chemical, biological and toxin weapons and of their commitment to the goals of arms control and disarmament. That said, the imperfections and shortcomings of these treaties signify the influence of realist fears and political calculations that pervade international negotiations. In the case of the BTWC, two points merit attention. The first pertains to the lack of a verification mechanism when the treaty was first agreed in the early 1970s. Subsequent revelations of secret state-led offensive biological programmes in the former Soviet Union, South Africa and Iraq up until the early 1990s significantly undermined the Convention. Second, the failure to negotiate a binding protocol in 2001 has further dimmed the prospects for strengthening the regime and thus ensuring universal compliance with its prescriptions. Less acute but just as worrying is the situation regarding the CWC. Even though the Convention is exemplary in many respects, not least because of its verification system, almost universal membership and implementing body – the Organisation for the Prohibition of Chemical Weapons (OPCW) – it still faces serious challenges. For instance, while the treaty bans the development, production, acquisition and retention of chemical weapons, the definition of ‘purposes not prohibited under th[e] Convention’ includes ‘law enforcement including domestic riot control purposes’ (Article II.9d).
Some commentators have argued that, given the lack of a universally agreed definition of what kinds of activities count as ‘law enforcement’, this text opens a major loophole in the Convention.120 Several States Parties to the Convention have voiced concerns in this regard. Australia has noted that:

The weaponisation of [Central Nervous System] acting chemicals for law enforcement purposes is of concern to Australia due to the health and safety risks and the possibility of their deliberate misuse, both of which have the potential to undermine the global norm against the use of toxic chemicals for purposes prohibited by the Convention. […] Australia’s position is that it is not possible for a State Party to disseminate anaesthetics, sedatives or analgesics by aerial dispersion in an effective and safe manner for law enforcement purposes.121

Critics highlight the possibility of the deployment of novel chemical weapons for the purpose of countering terrorism, a danger evident in the 2002 Moscow theatre siege (Dubrovka), when the Russian security forces used a fentanyl-derivative agent, as a result of which about a sixth of the hostages and all of the terrorists involved died.122 In 2011 the European Court of Human Rights ruled with regard to the Dubrovka operation that:

there had been no violation of Article 2 (right to life) of the European Convention on Human Rights concerning the decision to resolve the hostage crisis by force and use of gas.123

The Court, nonetheless, noted that:

Even if the gas had not been a ‘lethal force’ but rather a ‘non-lethal incapacitating weapon’, it had been dangerous and even potentially fatal for a weakened person […].124

The Court further confirmed some of the earlier criticisms that had been levelled against the Government, particularly in terms of preparedness and the provision of medical assistance.125 According to the ruling, Russia had to pay damages to all 64 applicants – representatives of the siege victims. To date, Russian officials have withheld information concerning the exact formula of the gas used during the Dubrovka operation on security grounds.126 Given the lack of an internationally agreed definition of what constitutes ‘terrorism’, on the one hand, and the rise of irregular/asymmetric warfare and sporadic conflicts, on the other, some commentators have warned of a ‘grey area’ which may enable states to utilise non-traditional methods of war to gain advantage.127

Speed Differential Between Scientific Advancement and the Pace of Deliberative Systems

Deliberative systems encompass a vast array of practices, processes and mechanisms, both formal and informal, whereby a polity considers the ‘acceptability, appropriateness and control of novel developments in, or impacting on, shared social and physical arenas’.128 By design, they reflect and are informed by the values, beliefs and standards shared among the group or, in other words, by the prevalent culture. As such, deliberative systems vary across societies, with their intensity, inclusiveness and structure depending on established political and social norms. Yet their chief purpose and function remain virtually the same: to help societies adapt to the changing circumstances of their milieu in a way that ensures stability, sustainability and safety.

Public deliberation requires time; and wide-ranging life science advances, current and planned, pose profound challenges to shared ideas and ideals about the foundations of human relatedness and of social coherence, justice, human dignity and many other norms, both formal and informal.129 Given the ruminative nature of deliberative processes, on the one hand, and the speed at which biotechnology innovation is evolving, on the other, the danger of the former being steadily outpaced and overburdened by the latter is far from hypothetical. Consider the following passage sketching the scale of social changes likely to arise from the increasing convergence between nanotechnology, biotechnology, cognitive neuroscience and information technology:

In the foreseeable future, we will be inundated with new inventions, new discoveries, new start-ups, and new entrepreneurs. These will create new goods and new services. […] As expectations change, the process of politics and government will change. People’s lives will be more complex and inevitably overwhelming. Keeping up with the changes that affect them and their loved ones exhausts most people. They focus most of their time and energy on tasks of everyday life. In the future, when they achieve success in their daily tasks, people will turn to the goods and services, the new job and investment opportunities, and the new ideas inherent in the entrepreneurial creativity of the Age of Transitions. No individual and no country will fully understand all of the changes as they occur or be able to adapt to them flawlessly during this time.130

This vision of a ‘brave new world’ merits attention on two important grounds. First, it implies that the changes likely to occur in the not too distant future as a result of the rapid progress of science and technology are imminent and unavoidable, in the sense that their advent hardly depends on, or even requires, extensive public deliberation. Second, given that our capacity for adaptation to and grasp of those changes will be considerably impaired, the Age of Transitions leaves little space for public deliberation. To add to this gloomy picture, there is already some evidence that progress in the life sciences is overwhelming the existing deliberative mechanisms. For instance, Kelle et al. argue that the rapidity of biotechnology advancement, coupled with the immensity and complexity of the knowledge accumulated therefrom, complicates efforts to deal with potential risks, something evident in the regulatory gap that the convergence of chemistry and biology has created in the area of arms control.131 This is problematic, for the reduced resilience of deliberative systems provides favourable conditions in which scientific and technological innovation can continue unabated. A vicious circle is thus created in which the inability of deliberative systems to cope with the strain exerted by biotechnology advancement fuels the latter, turning it into a self-propelling force. The proliferation of contentious ‘gain-of-function’ research is a case in point. Even though the H5N1 controversy discussed in the preceding sections exposed the limitations of existing governance mechanisms for addressing the potential security, ethical and legal implications arising from such studies, it hardly prevented scientists from conducting similar experiments.
Indeed, less than four months after the moratorium on research involving contagious H5N1 virus was lifted, a team of Chinese researchers announced the creation of a hybrid of the H5N1 strain and the H1N1 virus that caused the 2009 flu pandemic.132 And it was not long before the newly emerged H7N9 influenza virus became airborne as well.133 If anything, these examples indicate that, in light of the rapid pace of life science progress, addressing governance concerns on a case-by-case basis is not only self-defeating but, given the number and variety of conundrums, likely to become unsustainable in the long run.

Runaway Biotechnology?

Given the significant potential of biotechnology to bring about multifaceted changes in different spheres of life and to generate considerable benefits in the form of new products, the enhancement of public and private capital and the alleviation of social ills, there is a powerful urge to allow the ongoing expansion of the life sciences to proceed largely unfettered. Risks are carefully calculated and, where possible, downplayed as hypothetical at the expense of comprehensive deliberation. And even when proposals for risk mitigation measures are entertained, preference is usually given to those unlikely to hinder the progress of the life sciences. By and large, there is a genuine belief that the existing governance mechanisms in the area of biotechnology can accommodate and cope with the wide-ranging pressures exerted by scientific innovation and the rapid diffusion of technologies with multiple uses by offering ‘solutions’ and handling concerns on a case-by-case basis. In particular, the technology of safety is still ‘celebrated as an unadulterated improvement for society as a whole’.134

Yet there are reasons for scepticism toward the adequacy and effectiveness of the governance approaches currently in place. Much of the discussion in the preceding sections has focused on the ways in which the increasing pace, growth and global diffusion of biotechnology advances are beginning to expose the limits of the existing measures for control and risk management by challenging accepted values and beliefs and redefining established norms of practice. As the multifaceted dynamics driving the biotechnology momentum continue to intensify and multiply, it becomes more and more difficult to comprehend, let alone foresee, the various impacts that the large-scale deployment and proliferation of novel scientific and technological advances have on both our social systems and the environment. Given the tight coupling between human-made and natural systems and their complex, often unanticipated interactions with catastrophic potential, the existing narrow definitions of risk are rendered inadequate.135 At the same time, the advent of new technologies with multiple adaptive applications opens up an array of possibilities for hostile exploitation, thus compelling governments to make tough decisions in an attempt to reconcile the benefits of biotechnology with the potential security concerns arising therefrom. While the advancement of biotechnology promises tremendous public health benefits, it also holds considerable catastrophic potential, as the case of ‘gain-of-function’ experiments illustrates. As scientific capabilities and work involving dangerous pathogens proliferate globally, so do risks and the prospects of failure, whether technical or arising from human error.
Indeed, assessing the rapidly evolving life science landscape, some security commentators argue that ‘current genetic engineering technology and the practices of the community that sustains it have definitively displaced the potential threat of biological warfare beyond the risks posed by naturally occurring epidemics’.136 Laboratories, however well equipped, do not exist in isolation but are an integral part of a larger ecological system. As such, they constitute a ‘buffer zone’ between the activities carried out inside and the wider environment. And despite being technically advanced and designed to ensure safety, this ‘buffer zone’, like other safety systems, is far from infallible. For one thing, mechanical controls leave room for human error and personal judgement, factors that could be highly consequential but that can hardly be modelled or predicted with certainty.137

The speed at which the transformation of the life sciences is taking place is yet another factor that adds to the complexity of life science governance. Stability is a fundamental condition for the development and preservation of human and natural systems alike. In social systems, culture is the primary source of stability, for it determines what values, beliefs, practices and modes of behaviour are deemed acceptable and, as such, lays the foundations of order. All forms of governance are therefore cultural artefacts. Culture also provides the tacit standards whereby change is assessed and treated as acceptable or unacceptable. Hence, any state of affairs in which the rate of change outstrips the capacity for regulation disrupts the ordinary functioning of the system and jeopardises its preservation:

The breakdown of human regulation does not extinguish regulation of a simpler sort. […] The system formed by men and the rest of the natural world will continue to regulate itself after a fashion, even if human regulation wholly fails at all levels above the primary group. But the resulting ‘order’ would exclude all those levels of human order which man-made stability makes possible.138

To be sure, a world characterised by runaway biotechnology would be far different from the one we know. The main challenge in averting this prospect lies in ensuring that the systems of governance are in sync with the progress of the life sciences. History has shown that even highly developed, long-standing systems of governance can fail for reasons as diverse as disasters, loss of authority or legitimacy of governing bodies, and pervasive corruption. A further source of failure is the inability of a society to adapt to its changing milieu:

Men are adaptable; they can learn to live even in harsh and hostile environments – so long as the environment remains constant enough to give them time to learn. […] If they form the habit of adapting by constantly changing that to which they are trying to adapt, they build uncertainty into the very structure of their lives. They institutionalise cluelessness.139

The process of adaptation is closely connected to cultural patterns, and any serious disruption of the latter could impair it severely. The extent to which change takes place within the framework of the prevalent culture defines the borderline between system evolution and system disintegration. The governance mechanisms currently in place, both formal and informal, are all a function of historical, cultural and socio-political contingencies. As such, their capacity for adaptation largely depends on our ability to comprehend and assimilate the complex changes that the progress of biotechnology brings about. They can evolve only as fast as our shared standards, values, routines and perceptions allow them to. That is why governance can hardly be reduced to a technocratic exercise; on the contrary, to be effective, it requires extensive deliberation and full appreciation of the far-reaching implications of novel life science advances.


  1. 1.

    Henrik Noes Piester et al. Trends and Drivers of Change in the Biomedical Healthcare Sector in Europe: Mapping Report (Dublin: European Foundation for the Improvement of Living and Working Conditions, 2007), p. 2. See also Eric Grace, Biotechnology Unzipped: Promises and Realities (Washington, DC: Joseph Henry Press, 1997).

  2. 2.

    See, for example, J Cello et al. ‘Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template’, Science, vol. 297:5583 (2002), pp. 1016–1018; Eckard Wimmer, ‘The Test-Tube Synthesis of a Chemical Called Poliovirus: The Simple Synthesis of a Virus Has Far-Reaching Societal Implications’, EMBO Reports, vol. 7 (2006), pp. S3–S9; Ian Sample, ‘Craig Venter Creates Synthetic Life Form’, The Guardian, 20 May 2010, available at (last accessed 9/10/2013).

  3. 3.

    Moore’s Law pertains to the rapid rate of technological development and advances in the semiconductor industry, specifically the doubling of the number of transistors on integrated circuits that occurs approximately every 18 months. Although advances in the life sciences occur at more random intervals and are driven by new conceptual breakthroughs in understanding of biological processes, it is a useful metaphor for the exponential growth of knowledge related to biology. See Committee on the Advances in Technology and the Prevention of Their Application to Next Generation Bioterrorism and Biological Warfare Threats, An International Perspective on Advancing Technologies and Strategies for Managing Dual-Use Risks: Report of a Workshop (Washington, DC: National Academies Press, 2005).

  4. 4.

    See, for example, Michael Hopkins et al. ‘The Myth of the Biotech Revolution: An Assessment of Technological, Clinical and Organisational Change’, Research Policy, vol. 36 (2007), pp. 566–589; Paul Nightingale and Paul Martin, ‘The Myth of the Biotech Revolution’, Trends in Biotechnology, vol. 22:11 (2004), pp. 564–569.

  5. 5.
  6. 6.

    National Research Council, Globalization, Biosecurity and the Future of the Life Sciences (Washington, DC: National Academies Press, 2006), p. 2.

  7. 7.

    Ibid., p. 79.

  8. 8.

    Jeffrey Macher and David Mowery (ed.), Innovation in Global Industries: U.S. Firms Competing in a New World (Washington, DC: National Academies Press, 2008), p. 239.

  9. 9.

    See, for example, Tara Acharya et al. ‘Biotechnology and the UN’s Millennium Development Goals’, Nature Biotechnology, vol. 21:12 (2003), pp. 1434–1436; Abdallah S. Daar, ‘Top Ten Biotechnologies for Improving Health in Developing Countries’, Nature Genetics, vol. 32 (2002), pp. 229–232.

  10. 10.

    Charles Wessner and Alan Wolff (ed.), Rising to the Challenge: U.S. Innovation Policy for Global Economy (Washington, DC: National Academies Press, 2012), p. 38.

  11. 11.

    Burrill & Co, Biotech 2012: Innovating in the New Austerity, Burrill & Co’s 26th Annual Report on the Life Sciences Industry (San Francisco CA: Burrill and Co, 2012), p. 15. See also Burrill & Co, Biotech 2013: Capturing Value (San Francisco CA: Burrill and Co, 2013).

  12. 12.

    National Research Council, Biosecurity, Globalization and the Future of the Life Sciences, op cit., p. 79.

  13. 13.

    National Research Council, An International Perspective on Advancing Strategies for Managing Dual-Use Risks (Washington, DC: National Academies Press, 2005), p. 36.

  14. 14.

    Roger Brent, ‘In the Valley of the Shadow of Death’, DSpace@MIT, 22 November 2006, p. 3, available at (accessed 10/03/2013).

  15. 15.

    National Research Council, Biosecurity, Globalization and the Future of the Life Sciences, p. 94, op cit.

  16. 16.

    United Nations Conference on Trade and Development, The Biotechnology Promise: Capacity-Building for Participation of Developing Countries in the Bioeconomy, UNCTAD/ITE/IPC/MISC/2004/2 (New York/Geneva: United Nations, 2004), p. 1, available at (accessed 21/10/13).

  17. 17.

    See, for example, James Spohrer and Douglas Engelbart, ‘Converging Technologies for Enhancing Human Performance: Science and Business Perspectives’, Annals of the New York Academy of Sciences, vol. 1013 (2004), pp. 50–82; Alfred Nordmann, Converging Technologies – Shaping the Future of European Societies (Brussels: European Communities, 2004); Tsjalling Swierstra et al. ‘Taking Care of the Symbolic Order: How Converging Technologies Challenge Our Concepts’, Nanoethics, vol. 3 (2009), pp. 269–280; Carl Elliot, ‘Enhancement Technologies and the Modern Self’, Journal of Medicine and Philosophy, vol. 36 (2011), pp. 364–374; George Khushf, ‘The Ethics of NBIC Convergence’, Journal of Medicine and Philosophy, vol. 32 (2007), pp. 185–196; William Bainbridge, ‘Converging Technologies and Human Destiny’, Journal of Medicine and Philosophy, vol. 32 (2007), pp. 197–216; Franc Mali, ‘Bringing Converging Technologies Closer to Civil Society: The Role of Precautionary Principle’, Innovation: The European Journal of Social Science Research, vol. 22:1 (2009), pp. 53–75.

  18. 18.

    Jeffrey Macher and David Mowery (ed.), Innovation in Global Industries, op cit., p. 237.

  19. 19.

    Roger Brent, ‘In the Valley of the Shadow of Death’, p. 4, op cit.

  20. 20.
  21. 21.

    WHO, World Health Report, 2013, available at (accessed 29/01/2014).

  22. 22.

    See the ‘Infectious Diseases’ section of the WHO website: (accessed 19/01/2014).

  23. 23.

    UN FAO, Wold Agriculture: Towards 2015/2030, Summary Report, 2002, available at (accessed 19/01/2014).

  24. 24.

    For information on the NIH budget for 2012, see (accessed 29/01/2014); on estimates of funding distributions for Various Research, Condition, and Disease Categories, 2011–2016 see (accessed 21/09/2015). For an overview of the NIH budget for 2015, see David Malakoff and Jeffrey Mervis, ‘First Look: US Spending Deal a Mixed Bag for Science’, ScienceInsider, 9 December 2014, available at (accessed 16/09/2015); Jocelyn Kaiser, ‘Within NIH’s Flat 2015 Budget, a Few Favourites’, ScienceInsider, 10 December 2014, available at (accessed 16/09/2015).

  25. 25.

    For information on the EU Sixth Framework Programme and the Activity Area of Life Sciences, Genomics and Biotechnology for Health, see (accessed 29/01/2014).

  26. 26.

    See WHO, Cuba – Battling Cancer with Biotechnology, January 2013, available at (accessed 29/01/2014).

  27. 27.

    See Clive James, Global Status of Commercialised Biotech/GM Crops: 2012, Brief No.44, International Service for the Acquisition of Agri-Biotech Applications, 2012, available at (accessed 29/01/2014).

  28. 28.

    ‘Plant Genetic Engineering: China Hesitates on the Brink’, GMO Safety, 30 August 2011, available at (accessed 29/01/2014).

  29. 29.

    Charles Wessner and Alan Wolff ed., Rising to the Challenge, op cit., p. 41.

  30. 30.

    Jennifer Levin, ‘Government Academic, and Venture Firms Come Together in March to Fund Translational and Early-Stage Development’, FierceBiotech, 4 April 2012, available at (accessed 21/10/13).

  31. 31.

    Daniel Greenberg, Science for Sale: The Perils, Rewards and Delusions of Campus Capitalism (Chicago, IL: The University of Chicago Press, 2007), p. 90.

  32. 32.

    David Blumenthal et al. ‘Relationships between Academic Institutions and Industry in the Life Sciences – an Industry Survey’, The New England Journal of Medicine, vol. 334:6 (1996), pp. 368–373; Jason Owen-Smith and Walter Powell, ‘The Expanding Role of University Patenting in the Life Sciences: Assessing the Importance of Experience and Connectivity’, Research Policy, vol. 32 (2003), pp. 1695–1711; Daniel Lee Kleinman, Impure Cultures: University Biology and the World of Commerce (Madison: University of Wisconsin Press, 2003).

  33. 33.

    Hui Yang and Steven Buccola, ‘University-Industry Relationships and the Design of Biotechnology Research’, paper presented at the Annual Meeting of the American Agricultural Economics Association, Montreal, Canada, 27–30 July 2000, available at: (accessed 25/11/2013); see also Dorothy Nelkin et al. ‘University-Industry Alliances’, Science, Technology and Human Values, vol. 12:1 (1987), pp. 65–74.

  34. 34.

    See, for example, Henry Etzkowitz, ‘Entrepreneurial Science in the Academy: A Case of the Transformation of Norms’, Social Problems, vol. 36:1 (1989), pp. 14–29; Loet Leydesdorff and Martin Meyer, ‘The Triple Helix of University-Industry-Government Relations’, Scientometrics, vol. 58:2 (2003), pp. 191–203; Henry Etzkowitz, The Triple Helix: University-Industry-Government Innovation In Action (London: Routledge, 2008); Henry Etzkowitz and Loet Leydesdorff, ‘The Dynamics of Innovation: from National Systems and “Mode 2” to a Triple Helix of University–Industry–Government Relations’, Research Policy, vol. 29:2 (2000), pp. 109–123.

  35. 35.

    Daniel Greenberg, Science for Sale, op cit., p. 102; Mark Cooper, ‘Commercialisation of the University and the Problem Choice by Academic Biological Scientists’, Science, Technology, and Human Values, vol. 34:5 (2009), pp. 629–653.

  36. 36.

    Ken Auletta, ‘Get Rich U’, The New Yorker, 30 April 2012, available at (accessed 25/11/2013). See also Dina Biscotti et al. ‘The “Independent Investigator”: How Academic Researchers Construct Their Professional Identity in University-Industry Agricultural Biotechnology Research Collaborations’, in Nina Bandelj (ed.), ‘Economic Sociology of Work’, Research in the Sociology of Work Series, vol. 18 (2009), pp. 261–285; Mathias Kaisar, ‘Toward More Secrecy in Science: Comments on Some Structural Changes in Science – and on Their Implications for an Ethics of Science’, Perspectives on Science, vol. 4:2 (1996), pp. 207–230.

  37. 37.

    Ibid.; see also Daniel Greenberg Science, Money and Politics: Political Triumph and Ethical Erosion (Chicago, IL: The University of Chicago Press, 2001).

  38. 38.

    Susan Wright and David Wallace, ‘Varieties of Secrets and Secret Varieties: The Case of Biotechnology’, op cit.

  39. 39.

    Ibid., pp. 53–54.

  40. 40.

    For a detailed analysis on the US decision to reject the draft BTWC Protocol, see Malcolm Dando, Preventing Biological Warfare: The Failure of American Leadership (Basingstoke: Palgrave, 2002).

  41. 41.

    Office of Press Secretary, Fact Sheet: President Bush Signs Biodefence for the 21st Century, 28 April 2008, The White House, Washington DC, available at$File/HSPD%2010.pdf?OpenElement (accessed 2/01/2014).

  42. 42.

    Hillel Cohen, ‘The Pitfalls of Bioterrorism Preparedness: The Anthrax and Smallpox Experiences’, American Journal of Public Health, vol. 94:10 (2004), pp. 1667–1671; Michael Selgelid, ‘Bioterrorism and Smallpox Planning: Information and Voluntary Vaccination’, Journal of Medical Ethics, vol. 30 (2004), pp. 558–560.

  43. 43.

    See Susan Wright, ‘Taking Biodefence Too Far’, Bulletin of Atomic Scientists, vol. 60:6 (2004), pp. 58–66; Eileen Choffnes, ‘New Labs, More Terror’, Bulletin of Atomic Scientist, vol. 58:5 (2002), pp. 29–32.

  44. 44.

    Nick Schwellenbach, ‘A Plague of Researchers’, Bulletin of Atomic Scientists, vol. 61:3 (2005), pp. 14–16; Marylia Kelley and Jay Coghlan, ‘Mixing Bugs and Bombs’, Bulletin of Atomic Scientists, vol. 59:5 (2003), pp. 24–31; Roxanne Khamsi, ‘Lab Loses Trio of Plague Mice’, Nature News, 16 September 2005, available at (accessed 2/01/2014).

  45. 45.

    Martin Enserink and David Malakoff, ‘The Trials of Thomas Butler’, Science, vol. 302:5663 (19 December 2003), pp. 2054–2063; for a detailed account of the ‘Anthrax letters’ attack and the controversy of the US biodefence programme see Jeanne Guillemin, American Anthrax: Fear, Crime, and the Investigation of the Nation’s Deadliest Bioterror Attack (New York: Henry Hol and Co, 2011); Scott Shane, ‘Army Suspends Germ Research at Maryland Lab’, New York Times, 9 February 2009, available at (accessed 2/01/2014); Mark Wheelis and Malcolm Dando, ‘Back to Bioweapons?’, Bulletin of Atomic Scientists, vol. 59:1 (2003), pp. 40–46.

  46. 46.

    See, for example, International Health Regulations (WHO, 2005), specifically section ‘Laboratory’, available at (accessed 3/01/2014); Pandemic Influenza Preparedness Framework for the Sharing of Influenza Viruses and Access to Vaccines and other Benefits (WHO, 2011), available at (accessed 3/01/2014); Select Agent Programme (US, 2002), available at (accessed 3/01/2014); Human Tissue Act of 2004, see (accessed 3/01/2014). For further discussion on the implementation of biotechnology regulations, see Bo Sundqvist et al. ‘Harmonisation of European Laboratory Response Networks by Implementing CWA 15793: Use of Gap Analysis and an “Insider” Exercise as Tools’, Biosecurity and Bioterrorism: Biodefence Strategy, Practice, and Science, vol. 11:S1 (2013), pp. 36–44; Julie Fisher and Rebecca Katz, ‘Moving Forward to 2014: Global IHR (2005) Implementation’, Biosecurity and Bioterrorism: Biodefence Strategy, Practice, and Science, vol. 11:2 (2013), pp. 153–156. On the development of health and safety regulations on the use of nanotechnology, see Eileen Kuempel et al. ‘Risk Assessment and Risk Management of Nanoparticles in the Workplace: Translating research into Practice’, Annals of Occupational Hygiene, vol. 56:5 (2012), pp. 491–505.

  47. 47.

    See Catherine Rhodes, International Governance of Biotechnology, op cit.

  48. 48.

    David Dickson, The New Politics of Science, op cit. p. 268. There is a debate on whether the ‘American model’ of science-policy making underpinned by neoliberal ideology is fully embraced in Europe. See, for example, Gabriele Abels, ‘The Long and Winding Road from Asilomar to Brussels: Science, Policy and the Public in Biotechnology Regulation’, Science as Culture, vol. 14:4 (2005), pp. 339–353; Herbert Gottweis, ‘Transnationalising Recombinant-DNA Regulation: Between Asilomar, EMBO, the OECD, and the European Community’, Science as Culture, vol. 14:4 (2005), pp. 325–338. Further, a Policy Paper issued by a Business Taskforce appointed by the UK Government issued a Policy Paper in late 2013 demanding the liberalization of the existing EU legislation which, in their view, ‘places restrictions on products and technologies without adequate evidence of risk’. See Department for Business, Innovation & Skills and the Prime Minister’s Office, Cut the EU Red Tape: Report form the Business Task Force, Policy Paper October 2013, available at (accessed 29/01/2014).

  49.

    See, for example, National Science Advisory Board for Biosecurity (NSABB), Proposed Framework for the Oversight of Dual Use Life Science Research: Strategies for Minimising the Potential Misuse of Research Information, June 2007, available at (accessed 6/01/2014). On the potential for misuse of novel scientific developments, see James Petro et al. ‘Biotechnology: Impact on Biological Warfare and Biodefence’, Biosecurity and Bioterrorism: Biodefence Strategy, Practice, and Science, vol. 1:3 (2003), pp. 161–168; Gregory Koblentz, ‘Biosecurity Reconsidered: Calibrating Biological Threats and Responses’, International Security, vol. 34:4 (2010), pp. 96–132; Christian Enemark and Ian Ramshaw, ‘Gene Technology, Biological Weapons, and the Security of Science’, Security Studies, vol. 18 (2009), pp. 624–641. Some commentators have expressed scepticism toward the claim that scientific and technological advancement poses serious threats, underscoring instead the importance of other factors, such as socio-economic and socio-technical contexts. See, for example, Kathleen Vogel, ‘Intelligent Assessment: Putting Emerging Biotechnology Threats in Context’, Bulletin of the Atomic Scientists, vol. 69:1 (2013), pp. 43–52; Sonia Ben Ouagrham-Gormley, ‘Barriers to Bioweapons: Intangible Obstacles to Proliferation’, International Security, vol. 36:4 (2012), pp. 80–114. Others, however, argue that advances in modern biology and medicine have implications for the evolution of biological weapon programmes. See Malcolm Dando, ‘The Impact of the Development of Modern Biology and Medicine on the Evolution of Offensive Biological Warfare Programmes in the Twentieth Century’, Defence Analysis, vol. 15:1 (1999), pp. 43–62; Kathryn Nixdorff and Wolfgang Bender, ‘Ethics of University Research, Biotechnology and Potential Military Spin-Off’, Minerva, vol. 40 (2002), pp. 15–35.

  50.

    See Ronald Jackson et al. ‘Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox’, Journal of Virology, vol. 75:3 (2001), pp. 1205–1210; Samantha Robins et al. ‘The Efficacy of Cidofovir Treatment of Mice Infected with Ectromelia (Mousepox) Virus Encoding Interleukin-4’, Antiviral Research, vol. 66:1 (2005), pp. 1–7; Rachel Nowak, ‘Disaster in the Making’, The New Scientist, vol. 2273, 13 January 2001; Arno Mullbacher and Mario Lobigs, ‘Creation of Killer Poxvirus Could Have Been Predicted’, Journal of Virology, vol. 75:18 (2001), pp. 8353–8355; Jeronimo Cello et al. ‘Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template’, op cit.; Steve Connor, ‘Fears of Bioterrorism as Scientists Create Deadly Polio Virus’, The Independent, 12 July 2002; Michael Selgelid, ‘A Tale of Two Studies: Ethics, Bioterrorism and the Censorship of Science’, The Hastings Center Report, vol. 37:3 (2007), pp. 35–43; David Whitehouse, ‘First Synthetic Virus Created’, BBC News, 11 July 2002, available at (accessed 6/01/2014); Terrence Tumpey et al. ‘Characterisation of the Reconstructed 1918 Spanish Influenza Pandemic Virus’, Science, vol. 310:5745 (2005), pp. 77–80.

  51.

    On the governance of dual-use research, see Christine Uhlenhaut et al. ‘Protecting Society: Biological Security and Dual-Use Dilemma in the Life Sciences – Status Quo and Options for the Future’, EMBO Reports, vol. 14:1 (2013), pp. 25–30; Catriona McLeish and Ralf Trapp, ‘The Life Sciences Revolution and the BWC: Reconsidering the Science and Technology Review Process in a Post-Proliferation World’, The Non-Proliferation Review, vol. 18:3 (2011), pp. 527–543. Some novel developments related to the way in which dual-use research is governed are worthy of note. For instance, the Robert Koch Institute, the central federal institution responsible for disease control and prevention in Germany, has recently implemented an Internal Directive on Dual-Use Potential of Life Sciences Research featuring a Code of Conduct for Risk Assessment and Risk Mitigation. Article 5 of the Code lists a number of activities in place for the purpose of raising awareness among the Institute staff of dual-use issues. The full text of the Code is available at (accessed 27/01/2014); another example is the licensing programme run by the Danish Centre for Biosecurity and Biopreparedness, the national authority that controls the use of dual-use materials, which features an education and outreach component. A full description of the programme is available at (accessed 27/01/2014). For a proposal on how to improve the governance of emerging technologies with far-reaching implications, see Kenneth Oye et al. ‘Regulating Gene Drives’, Science, vol. 345:6197 (2014), pp. 626–628.

  52.

    Daniel Chambliss, Beyond Caring: Hospitals, Nurses and the Social Organisation of Ethics (Chicago, IL: University of Chicago Press, 1996), p. 6. See also Jim Whitman, When Dual-Use Issues Are so Abundant, Why Are Dual-Use Dilemmas so Rare, Research Report for the Wellcome Trust Project on ‘Building Sustainable Capacity in Dual-Use Bioethics’, University of Bradford, 2010, available at (accessed 29/01/2014). On the limitations of the risk-benefit framework for assessing life science research of concern, see Brian Rappert, ‘Why Has Not There Been More Research of Concern’, Frontiers in Public Health, vol. 2:74 (2014), pp. 1–14.

  53.

    Daniel Chambliss, Beyond Caring, op cit., p. 92.

  54.

    See Marcia Barinaga, ‘Asilomar Revisited: Lessons for Today?’, Science, vol. 287:5458 (2000), pp. 1584–1585; Sheldon Krimsky, ‘From Asilomar to Industrial Biotechnology: Risks, Reductionism and Regulation’, Science as Culture, vol. 14:4 (2005), pp. 309–323.

  55.

    See Malcolm Dando and Brian Rappert, ‘Codes of Conduct for Life Sciences: Some Insights from UK Academia’, Briefing Paper No. 16 (2nd series: 2005), University of Bradford, available at (accessed 8/03/2013). See also German Ethics Council, Biosecurity – Freedom and Responsibility of Research (Berlin: Deutscher Ethikrat, 2014), available at (accessed 17/09/2015).

  56.

    Charles Perrow, Normal Accidents: Living with High-Risk Technologies, op cit., p. 12. On the framing of risk in biotechnology, see Geert van Calster, ‘Risk Regulation, EU Law and Emerging Technologies: Smother or Smooth?’, Nanoethics, vol. 2 (2008), pp. 61–71; Jennifer Kuzma and John Besley, ‘Ethics of Risk Analysis and Regulatory Review: From Bio- to Nanotechnology’, Nanoethics, vol. 2 (2008), pp. 149–162; Les Levidow et al. ‘European Biotechnology Regulation: Framing the Risk Assessment of a Herbicide Tolerant Crop’, Science, Technology, and Human Values, vol. 22:4 (1997), pp. 472–505; Lisa Clark, ‘Framing the Uncertainty of Risk: Models of Governance for Genetically Modified Crops’, Science and Public Policy, vol. 40 (2013), pp. 479–491; Jesper Toft, ‘Denmark’s Regulation of Agri-Biotechnology: Co-Existence Bypassing Risk Issues’, Science and Public Policy, vol. 32:4 (2005), pp. 293–300; Ereck Chakauya et al. ‘Riding the Tide of Biopharming in Africa: Considerations for Risk Assessment’, South African Journal of Science, vol. 102 (2006), pp. 284–288; Jean-Michel Marcoux and Lyne Létourneau, ‘A Distorted Regulatory Landscape: Genetically Modified Wheat and the Influence of Non-Safety Issues in Canada’, Science and Public Policy, vol. 40 (2013), pp. 514–532; Shawn Harmon et al. ‘Governing Risk, Engaging Publics and Engendering Trust: New Horizons for Law and Social Science’, Science and Public Policy, vol. 40 (2013), pp. 25–33; Maaike van Tuyll, ‘Dealing with Future Risks in the Netherlands’, Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science, vol. 11:S1 (2013), pp. 555–563. On the shortcomings of the existing models for public deliberation on the risks of biotechnology, see Les Levidow, ‘European Public Participation as Risk Governance: Enhancing Democratic Accountability for Agrobiotech Policy’, East Asian Science, Technology and Society: An International Journal, vol. 1 (2007), pp. 19–51.

  57.

    David Dickson, The New Politics of Science, op cit., p. 285. For a critique of cost-benefit analysis, see also Brian Rappert, ‘The Benefits, Risks and Threats of Biotechnology’, Science and Public Policy, vol. 35:1 (2008), pp. 1–7.

  58.

    Ibid., p. 286.

  59.
  60.

    See Tatyana Novossiolova et al. ‘The Creation of Contagious H5N1 Avian Influenza Virus: Implications for the Education of Life Scientists’, Journal of Terrorism Research, vol. 3:1 (2012), pp. 39–51.

  61.

    All meetings of the US National Science Advisory Board for Biosecurity (NSABB) convened to discuss the manuscripts were restricted to selected individuals and full proceedings were never published. Moreover, the subsequent Consultation Meeting organised by the World Health Organisation (WHO) in February 2012, which rejected the NSABB recommendation for a redacted publication of the manuscripts, featured the lead scientists who conducted the experiments and representatives of the US National Institutes of Health, the primary funding body of both studies. See Gretchen Vogel, ‘Flu Experts – and One Ethicist – Debate Controversial H5N1 Papers’, ScienceInsider, 16 February 2012, available at (accessed 27/01/2014). The full list of participants and final report of the Meeting are available at (accessed 27/01/2014).

  62.

    For more information, see (accessed 16/09/2015); Martin Enserink, ‘Fight Over Dutch H5N1 Paper Enters Endgame’, ScienceInsider, 24 April 2012, available at (accessed 6/01/2014).

  63.

    See Martin Enserink, ‘Fight Over Dutch H5N1 Paper Enters Endgame’, ScienceInsider, 24 April 2012, available at (accessed 17/09/2015); Martin Enserink, ‘Dutch Appeals Court Dodges Decision on Hotly Debated H5N1 Papers’, ScienceInsider, 16 July 2015, available at (accessed 17/09/2015).

  64.

    Letter from Michael Osterholm to Amy Patterson, 12 April 2012; see also Brendan Maher, ‘Bias Accusation Rattles US Biosecurity Board’, Nature News, 14 April 2012, available at (accessed 6/01/2014).

  65.

    See Masaki Imai et al. ‘Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets’, Nature, vol. 486 (2012), pp. 420–430; Sander Herfst et al. ‘Airborne Transmission of Influenza A/H5N1 Virus between Ferrets’, Science, vol. 336:6088 (2012), pp. 1534–1541. Supplementary materials are available at (accessed 27/01/2014).

  66.

    Pallab Ghosh, ‘“Safety Incidents” at Animal Lab’, BBC News, 26 May 2011, available at (accessed 8/01/2014).

  67.

    Alison Young, ‘Airflow Problems Plague CDC Bioterror Lab’, USA Today, 12 June 2012, available at (accessed 8/01/2014).

  68.

    US Government Accountability Office, High-Containment Laboratories: Assessment of the Nation’s Need Is Missing, 25 February 2013, available at (accessed 8/01/2014).

  69.

    Robert Walgate, ‘SARS Escaped Beijing Lab Twice’, The Scientist, 26 April 2004, available at (accessed 8/01/2014); Lawrence Altman, ‘Lab Infection Blamed for Singapore SARS Case’, New York Times, 24 September 2003, available at (accessed 8/01/2014).

  70.

    Alison Young, ‘Security Lapses Found at CDC Bioterror Lab’, USA Today, 27 June 2012, available at (accessed 8/01/2014). Between 2014 and 2015, several high-containment facilities in the US experienced serious biosafety lapses. Accidents involving dangerous pathogens such as the causative agents of anthrax and bird flu were reported. See Centers for Disease Control and Prevention, Report on the Potential Exposure to Anthrax, 7 November 2014, available at (accessed 17/09/2015); Editorial, ‘Biosafety in the Balance’, Nature, 25 June 2014, available at (accessed 17/09/2015); Marc Lipsitch, ‘Anthrax? That’s Not the Real Worry’, New York Times, 29 June 2014, (accessed 17/09/2015); Donald McNeil, ‘CDC Closes Anthrax and Flu Labs After Accidents’, New York Times, 11 July 2014, (accessed 17/09/2015); Ian Sample, ‘From Anthrax to Bird Flu – the Dangers of Lax Security in Disease-Control Labs’, The Guardian, 18 July 2014, (accessed 17/09/2015). About the same time, there were also reports of smallpox vials being retrieved after having been left unaccounted for over 50 years. See AP, ‘Forgotten Vials of Smallpox Found in Storage Room’, New York Times, 8 July 2014, (accessed 17/09/2015). Most recently, in 2015, there were reports of live anthrax being shipped from US military research facilities worldwide. See Nicky Woolf, ‘Anthrax Shipment from Pentagon the Result of a “Massive Institutional Failure”’, The Guardian, 23 July 2015, (accessed 17/09/2015).

  71.

    Letter from the Foundation for Vaccine Research to the European Commission, Response to Letter by the European Society for Virology on ‘Gain-of-Function’ Influenza Research and Proposal to Organise a Scientific Briefing for the European Commission and Conduct Comprehensive Risk-Benefit Assessment, 18 December 2013, available at (accessed 8/01/2014). On the ongoing debate on GOF, see National Research Council, Potential Risks and Benefits of Gain-of-Function Research: Summary of a Workshop (Washington, DC: National Academies Press, 2015).

  72.

    Philip Sharp et al. The Third Revolution: The Convergence of the Life Sciences, Physical Sciences and Engineering (Cambridge, MA: MIT White Paper, January 2011).

  73.

    See Committee on Biomolecular Materials and Processes, National Research Council, Inspired by Biology: From Molecules to Materials to Machines (Washington, DC: National Academies Press, 2008).

  74.

    National Research Council, A New Biology for the 21st Century (Washington, DC: The National Academies Press, 2009), p. 3.

  75.

    Ibid., p. 42.

  76.
  77.

    See Kinam Park, ‘Nanotechnology: What It Can Do for Drug Delivery’, Journal of Controlled Release, vol. 120:1–2 (2008), pp. 1–3; Kinam Park, ‘Facing the Truth about Nanotechnology in Drug Delivery’, ACS Nano, vol. 7:9 (2013), pp. 7442–7447; Suwussa Bamrungsap et al. ‘Nanotechnology in Therapeutics’, Nanomedicine, vol. 7:8 (2012), pp. 1253–1271; ‘Carbon Nanotubes – Bullets in the Fight against Cancer’, Community Research and Development Information Service, 10 September 2013, available at (accessed 13/01/2014).

  78.

    On the governance challenges brought about by the convergence between biology and other fields of science, see Francis Fukuyama and Caroline Wagner, Information and Biological Revolutions: Global Governance Challenges – Summary of a Study Group (Washington, DC: RAND Corporation, 2000), available at (accessed 29/01/2014); Spiez Convergence, Report of the First Workshop, 6–9 October 2014, available at (accessed 17/09/2015).

  79.

    See Jonathan Tucker, ‘The Body’s Own Bioweapons’, Bulletin of the Atomic Scientists, vol. 64:1 (2008), pp. 16–22.

  80.

    See Jonathan Tucker (ed.), Innovation, Dual Use, and Security: Managing the Risks of Emerging Biological and Chemical Technologies (Cambridge, MA: MIT Press, 2012); US National Academy of Sciences, Life Sciences and Related Fields: Trends Relevant to the Biological Weapons Convention (Washington, DC: National Academies Press, 2011).

  81.

    Katherine Gammon, ‘Building Better Implants’, MIT Technology Review, 28 September 2007, available at (accessed 14/01/2014).

  82.

    Kevin Bullis, ‘Some Nanotubes Could Cause Cancer’, MIT Technology Review, 22 May 2008, available at (accessed 14/01/2014).

  83.

    ‘First Self-Replicating Synthetic Bacterial Cell’, Press Release, J. Craig Venter Institute, 20 May 2010, available at (accessed 14/01/2014).

  84.

    Ian Sample, ‘Craig Venter Creates Synthetic Life Form’, The Guardian, 20 May 2010, available at (accessed 14/01/2014); ETC Group, ‘Synthia is Alive…and Breeding: Panacea or Pandora’s Box?’, News Release, 20 May 2010, available at (accessed 29/01/2014).

  85.

    On the security implications of synthetic biology, see International Council for the Life Sciences, Security Aspects of Synthetic Biology, Report of a Meeting, 5–7 2012, Heidelberg, Germany; International Council for the Life Sciences, Security Aspects of Synthetic Biology, Report of a Meeting, 7–8 March 2013, Hong Kong, both available at (accessed 28/01/2014); Alexander Kelle, Synthetic Biology and Biosecurity Awareness in Europe, Bradford Science and Technology Report No. 9, November 2007, available at (accessed 28/01/2014); UNICRI, Security Implications of Synthetic Biology and Nanobiotechnology: A Risk and Response Assessment of Advances in Biotechnology (Turin: UNICRI, 2012), available at (accessed 28/01/2014). On the social and ethical aspects of synthetic biology, see Presidential Commission for the Study of Bioethical Issues, New Directions: The Ethics of Synthetic Biology and Emerging Technologies, December 2010, available at (accessed 28/01/2014); Andrew Balmer and Paul Martin, Synthetic Biology: Social and Ethical Challenges, May 2008, Institute for Science and Society, University of Nottingham, available at (accessed 28/01/2014).

  86.

    See Kathleen Vogel, ‘Framing Biosecurity: An Alternative to the Biotech Revolution Model’, Science and Public Policy, vol. 35:1 (2008), pp. 45–54; James Revill and Catherine Jefferson, ‘Tacit Knowledge and the Biological Weapons Regime’, Science and Public Policy (2013), pp. 1–14. Vogel’s views are contested in Jonathan Tucker, ‘Could Terrorists Exploit Synthetic Biology’, The New Atlantis, No. 31 (Spring 2011), pp. 69–81.

  87.

    James Randerson, ‘Lax Laws, Virus DNA and Potential for Terror’, The Guardian, 14 June 2006, available at (accessed 14/01/2014). On the issue of commercial order screenings, see Stephen Maurer et al. Making Commercial Biology Safer: What the Gene Synthesis Industry Has Learned about Screening Customers and Orders, Working Paper, 17 September 2009, available at (accessed 28/01/2014); Michele Garfinkel et al. Synthetic Genomics: Options for Governance, October 2007, available at (accessed 28/01/2014). On the governance of synthetic biology, see Catherine Lyall, ‘Governing Genomics: New Governance Tools for New Technologies’, Technology Analysis and Strategic Management, vol. 19:3 (2007), pp. 369–386; Hans Bugl et al. ‘DNA Synthesis and Biological Security’, Nature Biotechnology, vol. 25:6 (2007), pp. 627–629; Stephen Maurer and Sebastian von Engelhardt, ‘Industry Self-Governance: A New Way to Manage Dangerous Technologies’, Bulletin of the Atomic Scientists, vol. 69:3 (2013), pp. 53–62; Jennifer Kuzma and Todd Tanji, ‘Unpacking Synthetic Biology: Identification of Oversight Policy Problems and Options’, Regulation and Governance, vol. 4 (2010), pp. 92–112; Filippa Lentzos, ‘Synthetic Biology, Security and Governance’, BioSocieties, vol. 7:4 (2012), pp. 339–351.

  88.

    Markus Schmidt and Gregor Giersch, ‘DNA Synthesis and Security’, in Marissa Campbell (ed.), DNA Microarrays, Synthesis and Synthetic DNA (Nova Science Publishers, 2012), p. 296.

  89.

    See Paul Rincon, ‘Science Enters $1,000 Genome Era’, BBC News, 15 January 2014, available at (accessed 27/01/2014); Carole Cadwalladr, ‘What Happened When I Had My Genome Sequenced’, The Observer, 8 June 2013, available at (accessed 9/01/2014); Julia Kollewe, ‘DNA Machine Can Sequence Human Genomes in Hours’, The Guardian, 17 February 2012, available at (accessed 9/01/2014). For information on the Human Genome Project, see (accessed 9/01/2014). For information about commercial companies offering full-genome sequencing, see (accessed 9/01/2014). For an overview of the developments in genome-based therapy, see Steve Olson and Adam Berger, Genome-Based Diagnostics: Clarifying Pathways to Clinical Use: Workshop Summary (Washington, DC: National Academies Press, 2012); Adam Berger and Steve Olson, Genome-Based Therapeutics: Targeted Drug Discovery and Development: Workshop Summary (Washington, DC: National Academies Press, 2012).

  90.

    Peter Walker, ‘DNA of 100,000 People to be Mapped for NHS’, The Guardian, 10 December 2012, available at (accessed 9/01/2014).

  91.

    See Q. Tian et al. ‘Systems Cancer Medicine: Towards Realisation of Predictive, Preventive, Personalised and Participatory (P4) Medicine’, Journal of Internal Medicine, vol. 271 (2012), pp. 111–121; Ben Quinn, ‘Angelina Jolie “Grateful and Moved” by Reaction to Her Mastectomy Decision’, The Guardian, 2 June 2013, available at (accessed 9/01/2014).

  92.

    Andrew Pollack, ‘DNA Blueprint for Fetus Built Using Tests of Parents’, New York Times, 6 June 2012, available at (accessed 9/01/2014).

  93.

    Ian Sample, ‘“Three-Parent Babies” Cure for Illness Raises Ethical Fear’, The Guardian, 5 June 2012, available at (accessed 9/01/2014).

  94.

    See Carole Cadwalladr, ‘What Happened When I Had My Genome Sequenced’, op cit.

  95.

    Ibid. Another point that Cadwalladr raises is the danger of a negative placebo effect whereby doubts about a genetic disorder may lead to psychosomatic symptoms.

  96.

    Daniel Kevles, ‘From Eugenics to Patents: Genetics, Law, and Human Rights’, Annals of Human Genetics, vol. 75:3 (2011), p. 330.

  97.

    See Abdul-Kareem Ahmed, ‘Unhidden Traits: Genomic Data Privacy Debates Heat Up’, Scientific American, 14 August 2013, available at (accessed 27/01/2014).

  98.

    Daniel Kevles, ‘From Eugenics to Patents’, op cit., p. 330.

  99.

    Ibid., p. 331. See also Harriet Washington, Deadly Monopolies: The Shocking Corporate Takeover of Life Itself – and the Consequences for Your Health and Our Medical Future (New York: Doubleday, 2011); Sheila Jasanoff, ‘Taking Life: Private Rights in Public Nature’, in Kaushik Sunder Rajan (ed.), Lively Capital: Biotechnologies, Ethics, and Governance in Global Markets (Duke University Press, 2012), pp. 155–184.

  100.

    See National Research Council, Globalization, Biosecurity, and the Future of the Life Sciences, op cit.; Malcolm Dando, ‘Benefits and Threats of Developments in Biotechnology and Genetic Engineering’ in SIPRI Yearbook, Armaments, Disarmament and International Security (Oxford: Oxford University Press, 1999). On ethnic gene markers, see Alice Roberts, The Incredible Human Journey: The Story of How We Colonised the Planet (London: Bloomsbury, 2009); Mark Shriver et al. ‘Ethnic-Affiliation Estimation by Use of Population-Specific DNA Markers’, American Journal of Human Genetics, vol. 60 (1997), pp. 957–964; Alastair Wood, ‘Racial Differences in the Response to Drugs – Pointers to Genetic Differences’, New England Journal of Medicine, vol. 344:18 (2001), pp. 1393–1395.

  101.

    Robert Wright, ‘The Achilles’ Helix’, New Republic, vol. 203:2–3 (1990), pp. 21–26.

  102.

    Daniel Kevles, ‘From Eugenics to Patents’, op cit., p. 330.

  103.

    Ian Sample, ‘Britain Ponders “Three-Person” Embryos to Combat Genetic Disease’, The Guardian, 20 March 2013, available at (accessed 10/01/2014).

  104.

    See Department of Health and the Human Fertilisation and Embryology Authority, Innovative Genetic Treatment to Prevent Mitochondrial Disease, Press Release, 28 June 2013, available at (accessed 10/01/2014); Peter Saunders, ‘Three-Parent Embryos for Mitochondrial Disease? Twelve Reasons for Caution’, LifeSiteNews, 28 June 2013, available at (accessed 10/01/2014); Ian Sample, ‘“Three-Parent” Babies Explained: What Are the Concerns and Are They Justified?’, The Guardian, 2 February 2015, available at (accessed 22/09/2015). In February 2015, the UK passed legislation allowing the use of the technique. See James Gallagher, ‘UK Approves Three-Person Babies’, BBC News, 24 February 2015, available at (accessed 22/09/2015).

  105.

    See Ronald Green, ‘Building Baby from the Genes Up’, The Washington Post, 13 April 2008, available at (accessed 10/01/2014).

  106.

    US National Academy of Sciences, Life Sciences and Related Fields, op cit., p. 63. One commentator distinguishes between ‘Big Science’ which was ‘top-down, hierarchical, vertical’ and ‘networked science’ characterised by ‘open systems, open software, open participation’. See Diane Rhoten, ‘The Dawn of Networked Science’, Chronicle of Higher Education, vol. 54:2 (2007), pp. 78–90. On the changing patterns of science collaboration, see James Porter, ‘Changing Dynamics of Collaboration in Life Sciences’, Science as Culture, vol. 22:3 (2013), pp. 388–393.

  107.
  108.

    See, for example, Marcus Wohlsen, Biopunk: Solving Biotech’s Biggest Problems in Kitchens and Garages (New York: Penguin, 2011); Heidi Ledford, ‘Life Hackers’, Nature, vol. 467 (2010), pp. 650–652; Editorial, ‘Garage Biology’, Nature, vol. 467 (2010), p. 634.

  109.

    Robert Carlson, Biology Is Technology: The Promise, Peril, and New Business of Engineering Life (Cambridge, MA: Harvard University Press, 2010).

  110.

    For more information, see (accessed 14/01/2014).

  111.

    For more information, see (accessed 14/02/2014); (accessed 14/02/2014).

  112.

    See, for example, Markus Schmidt, ‘Diffusion of Synthetic Biology: A Challenge to Biosafety’, Systems and Synthetic Biology, vol. 2:1–2 (2008), pp. 1–6; Markus Schmidt, ‘Do I Understand What I Can Create: Biosafety Issues in Synthetic Biology’, in Markus Schmidt et al. (eds), Synthetic Biology: The Technoscience and Its Societal Consequences (Dordrecht: Springer, 2010).

  113.

    ‘Charge Dropped against Artist in a Terror Case’, The Associated Press, 22 April 2008, available at (accessed 14/02/2014).

  114.

    See Edward Lempinen, ‘FBI, AAAS Collaborate on Ambitious Outreach to Biotech Researchers and DIY Biologists’, AAAS News, 1 April 2011, available at (accessed 28/02/2014). Alternative governance frameworks include: National Science Advisory Board for Biosecurity (NSABB), Strategies to Educate Amateur Biologists and Scientists in Non-Life Science Disciplines About Dual Use Research in the Life Sciences, June 2011, available at (accessed 28/02/2014); Catherine Jefferson, Governing Amateur Biology: Extending Responsible Research and Innovation in Synthetic Biology to New Actors, Research Report for the Wellcome Trust Project ‘Building Sustainable Capacity in Dual Use Bioethics’, 2013, available at (accessed 28/02/2014).

  115.

    See Jerry Seper, ‘Secret Project Manufactured Mock Anthrax’, The Washington Times, 26 October 2001, available at (accessed 28/02/2014); Judith Miller et al. Germs: Biological Weapons and America’s Secret War (New York: Simon & Schuster, 2001), pp. 297–299.

  116.

    Jim Whitman, ‘Global Governance and Twenty-First Century Technology’, in Brian Rappert (ed.), Technology and Security: Governing Threats in the New Millennium (Basingstoke: Palgrave, 2007), p. 96.

  117.

    Mihail Roco and William Sims Bainbridge, Converging Technologies for Improving Human Performance: Nanotechnology, Biotechnology, Information Technology and Cognitive Science (Washington, DC: National Science Foundation, 2002), p. 30.

  118.

    Report of the WHO Informal Consultation on Dual Use Research of Concern, 26–28 February 2013, Geneva, Switzerland, available at (accessed 28/01/2014).

  119.

    Alexander Kelle et al. Preventing a Biochemical Arms Race (Stanford: Stanford University Press, 2013), chapter 5; Jonathan Tucker, ‘Biological Threat Assessment: Is the Cure Worse than the Disease?’, Arms Control Today, vol. 34 (October 2004), available at (accessed 20/01/2014).

  120.

    Julian Perry Robinson, ‘Difficulties Facing the Chemical Weapons Convention’, International Affairs, vol. 84:2 (2008), p. 228; Michael Crowley, Dangerous Ambiguities: Regulation of Riot Control Agents and Incapacitants under the Chemical Weapons Convention, Bradford Non-Lethal Weapons Research Project, October 2009, University of Bradford. Available at (accessed 21/07/2015).

  121.

    Statement by Australia, Weaponisation of Central Nervous System Acting Chemicals for Law Enforcement Purposes, XIX Session of the Conference of the States Parties, 1–5 December 2014, OPCW, the Hague, the Netherlands, available at (accessed 21/07/2015).

  122.

    Artem Krechetnikov, ‘Moscow Theatre Siege: Questions Remain Unanswered’, BBC News, 24 October 2012, available at (accessed 28/01/2014).

  123.

    Registry of the European Court of Human Rights, Press Release: Use of Gas against Terrorists during the Moscow Theatre Siege Was Justified, but the Rescue Operation afterwards Was Poorly Planned and Implemented, ECHR 295 (2011), 20 December 2011.

  124.
  125.

    See [in Russian] ‘Chto eto bylo? Spasenie zalozhnikov ili unichtozhenie terroristov?’ [‘What Was It? Rescuing the Hostages or Eliminating the Terrorists?’], Novaya Gazeta, No. 86, 21 November 2002, available at (accessed 6/02/2016); John Dunlop, The 2002 Dubrovka and 2004 Beslan Hostage Crises: A Critique of Russian Counter-Terrorism (Stuttgart: Ibidem-Verlag, 2006).

  126.

    [in Russian] Vladimir Bogdanov, ‘Sekretov bol’she net’ [‘No More Secrets’], Rossiyskaya Gazeta, No. 5917, 23 October 2012, available at (accessed 6/02/2016).

  127.

    Julian Perry Robinson, ‘Difficulties Facing the Chemical Weapons Convention’, op cit., pp. 226–227. See also National Research Council, Avoiding Surprise in an Era of Global Technology Advances (Washington, DC: National Academies Press, 2005), particularly chapter 6.

  128.

    Jim Whitman, ‘The Challenge to Deliberative Systems of Technological Systems Convergence’, Innovation: The European Journal of Social Science Research, vol. 20:4 (2007), p. 330.

  129.

    Ibid., p. 336.

  130.

    Mihail Roco and William Sims Bainbridge, Converging Technologies for Improving Human Performance, op cit., pp. 39–40.

  131.

    See Alexander Kelle et al. Preventing a Biochemical Arms Race, op cit., p. 60; Julian Perry Robinson, ‘Bringing the CBW Conventions Closer Together’, The CBW Conventions Bulletin, Issue 80 (2008), pp. 1–4, available at (accessed 20/01/2014).

  132.

    See Ying Zhang et al. ‘H5N1 Hybrid Viruses Bearing 2009/H1N1 Virus Genes Transmit in Guinea Pigs by Respiratory Droplet’, Science, vol. 340:6139 (2013), pp. 1459–1463.

  133.

    Maria Zhu et al. ‘Infectivity, Transmission, and Pathology of Human-Isolated H7N9 Influenza Virus in Ferrets and Pigs’, Science, vol. 341:6142 (2013), pp. 183–186; Mathilde Richard et al. ‘Limited Airborne Transmission of H7N9 Influenza A Virus Between Ferrets’, Nature, vol. 501 (2013), pp. 560–563.

  134.

    Nan Goodman, Shifting the Blame: Literature, Law, and the Theory of Accidents in Nineteenth-Century America (Princeton, NJ: Princeton University Press, 1998), p. 145.

  135.

    Charles Perrow, Normal Accidents, op cit.

  136.

    Jose-Luis Sagripanti, ‘Building a Bio World’, CBRNe World, October 2013, p. 48, available at (accessed 6/08/2017).

  137.

    See Nan Goodman, Shifting the Blame, op cit., p. 147.

  138.

    Geoffrey Vickers, Freedom in a Rocking Boat: Changing Values in an Unstable Society (London: Penguin Press, 1970), p. 127.

  139.

    Geoffrey Vickers, Human Systems Are Different, op cit., p. 146; on the deleterious effects of rapid change that precludes the preservation of culture, see Helena Norberg-Hodge, ‘Learning from Ladakh: A Passionate Appeal for “Counter-Development”’, Earth Island Journal, vol. 7:2 (1992); Jared Diamond, Collapse: How Societies Choose to Fail or Survive (London: Penguin Books, 2005). Some commentators have criticised Diamond’s work as overly simplistic. For a summary of some of the criticisms levelled at his work, see Eric Powell, ‘Do Civilisations Really Collapse’, Archaeology, vol. 61:2 (2008).

Copyright information

© The Author(s) 2017

Authors and Affiliations

  1. Faculty of Social Sciences, University of Bradford, Bradford, United Kingdom
