
1 Introduction

Food Security is defined by the UN Food and Agriculture Organization (FAO) to exist when “all people, at all times, have physical and economic access to sufficient, safe and nutritious food.” Global (public) health security, on the other hand, is defined by the WHO (World Health Organization) as “the activities required, both proactive and reactive, to minimize vulnerability to acute public health events that endanger the collective health of populations.”

Food Security, as seen within a health security context, therefore relates to the systems put in place to deal with acute events related to foodborne hazards, be they chemical or microbiological in nature. However, it should be noted that systems aimed at preventing acute foodborne events (outbreaks) are not inherently different from systems for preventing foodborne disease in general: the same systems used to deal with outbreaks are also used to deal with sporadic foodborne cases and – at least in principle – with chronic foodborne disease.

While surveillance systems – and especially systems aimed at acute risks – often concentrate primarily on microbiological hazards, chemical hazards constitute a very significant part of food safety problems. This is why systems aimed at providing data for food contamination and foodborne disease prevention must consider both microbiological and chemical hazards.

The description of food security systems in this chapter will hence include all these areas – recognizing that in most countries there is only one system dealing with the prevention of foodborne diseases, typically governed by a food safety authority or, in some cases, several authorities. The food safety regulatory system includes oversight of both microbiological and chemical hazards, which in turn can both cause acute as well as chronic disease events.

This Chapter will include a description of existing (national and international) surveillance systems and of existing models for food safety risk assessment, as well as examples of risk mitigation action within recent food safety system developments for both chemical and microbiological hazards (physical hazards can also be important but generally constitute a minor proportion of food safety problems).

Since food security also relates to ‘sufficient food’, disasters related to famine need to be included as well. While it is generally recognized that the world’s food production capacity is certainly sufficient even for a global population of 10 billion, it is more questionable whether the present methods of production are sustainable, for instance relative to water and land use, phosphorus and nitrogen flows, CO2 and NH3 contamination, etc. Thus, the most important issue related to food production becomes that food systems must be transformed to produce more nutritious food with a lower environmental footprint [1].

Therefore, food security within a health security context also needs to include reflections on the sustainability of food production, both relative to assessment systems and to mitigation action. The Chapter will therefore also describe the relevant methodology for quantitative sustainability assessment, as well as the concept of global sustainability limits (boundaries) and potential action within agri- and aquaculture production systems to improve sustainability.

The sustainability of food production is not only related to the environment but can overlap with the safety of the food produced. Several examples of such interactions will be discussed, including one important example related to both human and animal health: the increase in antimicrobial resistance (AMR) of foodborne microorganisms. The increase in the number of multi-resistant bacteria (resistant to several antimicrobials) threatens a return to the pre-antibiotic era, where a simple scratch or a sore throat could be life-threatening. It is estimated that 700,000 people die globally every year from AMR microorganisms, and that this figure will escalate to 10 million by 2050 (the global death toll from cancer is 8 million) [2]. What fraction of this problem is caused by the use of antimicrobials in animals is still debated, but it is likely to be significant. The Chapter will describe existing (national and international) surveillance systems as well as examples of risk mitigation action.

In relation to the future of global surveillance of communicable (including foodborne) diseases, the present and future use of Next Generation Sequencing (NGS) will be described. The sharing of Whole Genome Sequences (WGS) of all microorganisms gives us the potential to develop standardized global surveillance of all microorganisms as well as of AMR, providing a basis for global, regional and national ‘One Health’ interventions to analyze, control and minimize the problem.

2 Food Safety Monitoring and Surveillance, Risk Assessment and Risk Mitigation

Decisions about policies aimed at preventing food contamination and foodborne illness have become very important, both nationally and internationally. Infectious disease surveillance systems are used in relation to both human and animal health. These systems are typically set up to collect data on the occurrence of diseases in humans and/or animals, thereby enabling identification of outbreaks, tracking the spread of diseases and providing early warning for national as well as international human and animal health institutions. Food safety surveillance systems focus on either food contamination, typically referred to as food monitoring, or foodborne disease surveillance.

Recent experience has shown that these traditional systems are not always effective or timely in alerting officials to newly emerging foodborne or zoonotic diseases: diseases transmitted between animals and humans [3]. Examples include HIV/AIDS, variant Creutzfeldt-Jakob Disease, Highly Pathogenic Avian Influenza (HPAI) viruses such as H5N1, and the pandemic H1N1 influenza virus. Zoonotic disease outbreaks seem to be increasing in number. Out of 175 pathogenic microbiological species considered to be ‘emerging’, an estimated 132 (75%) are zoonotic, and overall, zoonotic pathogens are twice as likely to be associated with emerging diseases as non-zoonotic pathogens [4].

Most important zoonoses relate in some way to animals in the food production chain. Food therefore becomes an important vehicle for many zoonotic pathogens. Zoonotic diseases related to food animals can be separated into three groups [5]. The first group comprises diseases with a potential for global spread and a dramatic public relations potential; these diseases often have a large human reservoir showing some level of human-to-human transmission, e.g. SARS, pandemic influenza A(H1N1) and certain types of antimicrobial-resistant (AMR) bacteria. The second group relates to the industrialized food production chain and includes Salmonella and Campylobacter, human pathogens that are often non-pathogenic in animals and seem to be distributed in all countries, both rich and poor. The third group consists of the ‘neglected zoonotic diseases’: zoonotic diseases which have been eradicated (or drastically reduced) in affluent economies through vaccination, culling policies, and/or the introduction of better animal management practices. In many poor settings, however, these diseases remain ‘neglected diseases’ and receive very little attention from national authorities or even international organizations. This group includes Brucella, bovine TB (tuberculosis), and many parasitic diseases, e.g. leishmaniasis and cysticercosis.

In addition to the factors described above, food production and food trade are now increasingly global, and thus some food-related problems are also global problems. On the positive side, globalization has helped with some of the important global food issues: it has helped deal with – at least to some degree – food insecurity in its most dramatic form, i.e. famine. To the extent that famine still occurs in certain regions, it is more an outcome of (political) inability to distribute the food that we actually produce (see Footnote 1). In recent decades the occurrence of major famines has diminished significantly and abruptly compared to earlier eras.

However, together with food, foodborne diseases now also travel the globe. If we do not stay on top of the problem, disease outbreaks might affect large parts of the global food sector negatively, ultimately leading to significant negative health impacts as well as negative financial and socio-economic effects. A more holistic and pro-active approach to food safety and disease surveillance may help prevent future food disasters and in the process build healthy economies.

One of the major issues related to regulatory action in food safety over recent decades has been the lack of cross-sectoral collaboration across the food production chain. Major food safety events have been significantly affected by the lack of collaboration between the animal health, food control and human health sectors. This led to renewed international action – or, more correctly, to discussion about how this apparent lack of coordination could be mitigated – which resulted in the creation of the (somewhat) novel concept of ‘One Health’.

2.1 The Need for One Health Surveillance: The Zoonotic Influenza Virus Examples

It was primarily the outbreaks of SARS (Severe Acute Respiratory Syndrome), zoonotic influenza, and BSE (Bovine Spongiform Encephalopathy) which alerted the world to the need for a One Health approach. Outbreaks of viral diseases in humans, originating in or spreading through farm animals (avian flu – influenza A(H5N1) – and swine flu – influenza A(H1N1)), have caused major global alerts in recent decades. These influenza outbreaks spread very quickly, either in the animal population (H5N1) or directly in the human population (H1N1), and formed a global threat to human health. H1N1 was therefore characterized by the WHO as a pandemic. Although, in total, the human disease burden related to the endemic bacterial zoonoses is probably manyfold higher than that of these influenza outbreaks, it is basically these relatively few but fast-spreading outbreaks that have put One Health on the global agenda. In addition, the failure to predict, monitor and control the spread of these diseases in animals presented regulators and politicians with a wake-up call, and made them demand (better) cross-sectoral collaboration between the animal and human health sectors [5].

Avian influenza (AI), caused by the influenza A virus, is one of several zoonotic influenza diseases. Although WHO for some time – and maybe under pressure from major pork producers – maintained that swine influenza should not be characterized as such, most scientists (including WHO) now refer to these influenza types according to their ‘natural’ host. Humans can be infected with avian, swine and other zoonotic influenza viruses, such as avian influenza virus subtypes A(H5N1), A(H7N9) and A(H9N2) and swine influenza virus subtypes A(H1N1), A(H1N2) and A(H3N2). The majority of human cases of avian influenza A virus infection have been associated with direct or indirect contact with infected live or dead poultry. Controlling the disease at the animal source is critical to decrease the risk to humans. Zoonotic influenza infection in humans will continue to occur, notably from avian and other animal sources. To minimize the public health risk, surveillance in both animal and human populations is essential.

Avian Influenza A outbreaks in birds not only impact animal production, but also give rise to a food risk caused by viral contamination of poultry products in the food supply chain. Differences between outbreaks caused by the H5N1 and H7N9 strains indicate that early detection of the AI virus in poultry is crucial for effective warning and control of AI and for ensuring food safety. The establishment of a poultry surveillance system enabling early detection is therefore urgent and critical [6].

Global human outbreaks of swine influenza A(H1N1) are not as prevalent as human outbreaks related to avian influenza, but have been more dramatic in outcome. Notably, the influenza pandemic in 1918 was caused by a strain of influenza A(H1N1). In June 2009, the WHO issued a pandemic alert concerning the spread of an influenza A(H1N1) virus, originally characterized in April 2009 in human patients in California and Texas, USA, and in patients from Mexico, who were likely closer to the original jump from the porcine reservoir [7]. This strain showed distinctive genetic characteristics, with a main mutation in the gene coding for hemagglutinin (HA). The remarkable features of A(H1N1)pdm09, compared with seasonal strains, are its high fatality rate and its higher incidence among younger people [8].

Conventional methods usually applied for AI diagnosis face some practical challenges in animal production chains. To establish a comprehensive poultry surveillance program throughout the poultry supply chain, systematic approaches and integrated methods are needed at every stage of this chain to limit AI outbreaks in animals and prevent AI outbreaks in humans. It should be noted that the novel application of close to real-time characterization of influenza virus strains using next generation sequencing is a very promising development in this area [9].

2.2 The Future of One Health Food Safety

Future achievements in food safety, public health and welfare will largely be based on how well politicians, researchers, industry, national agencies and other stakeholders manage to collaborate using the One Health approach. Data on occurrence and disease burden from foodborne hazards combined with knowledge of source attribution are crucial in assessing costs and benefits of control measures. Food safety resources should be allocated where they contribute most to One Health benefits. Without knowledge of the incidence and burden of disease associated with particular pathogen/food commodity combinations, prioritization of foodborne hazards for mitigation action is difficult [10].

The three most relevant international organizations in this area (WHO, FAO and OIE, the World Organization for Animal Health) have recognized that combating zoonoses is best achieved via a One Health approach, as stated in their seminal paper ‘A Tripartite Concept Note’ [11]. Given the impact that zoonotic diseases are recognized to have in socio-economic terms, a One Health vision is also endorsed by the World Bank (WB) and the United Nations Children’s Fund (UNICEF) [12].

While the groups of zoonotic diseases mentioned above are very different, they are all most efficiently prevented by a One Health approach which considers the full farm-to-fork chain, is based on surveillance data covering the full food production chain, and is ultimately linked to human public health disease data. Such preventive and holistic approaches may reduce both the disease burden to human health and the economic burden to developing economies, and therefore represent a significant potential for improvement as seen in a One Health perspective.

A number of food-related chemical hazards are shared by animals and people, either directly, through food or through the environment, and should therefore be covered within the One Health framework. Corn, for example, is a shared food ingredient and can be a source of aflatoxin poisoning for both people and animals if contaminated. The recent melamine poisoning of pets in North America and of children in China has highlighted the need for joint One Health investigations: in the melamine poisoning outbreaks, nephrotoxicity was observed in pets in 2007 and subsequently in Chinese infants and children in 2008 [13]. Chemical food contamination is a major cross-cutting issue; pesticides and other chemicals are often used in food production, sometimes inappropriately, providing opportunities for residues at dangerous levels in food products. The question of the exorbitant use of antimicrobials in animals will be dealt with in Sect. 4. The subject of animals as sentinels of environmental and ecosystem health has been discussed by the toxicology community for over 30 years [13]. In major contamination events the entire ecosystem, including people, is often affected by the pollution. Therefore, One Health monitoring and surveillance systems should clearly include chemical hazards.

The use of pesticides can protect crops and prevent post-harvest losses, thus contributing to food security. The development of pesticides was fundamental to the Green Revolution and the transformation of modern agriculture. More recently, evidence of serious impacts on the environment has emerged. Pesticide misuse and pesticides as water pollutants are increasingly serious global challenges, resulting in heavy environmental pollution and most likely significant health risks for humans [14]. Pesticide monitoring data from EU countries are reported to EFSA and typically used to evaluate the proportion of samples in which the Maximum Residue Level (MRL) is exceeded. In the period 2013–2015 (covering a total of 28,912 conventional and 1940 organic food samples), the MRL exceedance rates for conventional and organic food amounted to 1.2% and 0.2%, respectively [15]. It is important to note that MRLs do not directly reflect human health effect limits; instead, they reflect the lowest level manageable while maintaining the pest-killing effect under present agricultural methods.

Notably, the most significant human health risk related to pesticide use in agriculture is pesticide poisoning. In the USA, the Environmental Protection Agency (EPA) conducts poisoning surveillance to determine whether labeling is effective. Based on this, the EPA can require that interventions be instituted that involve changing pesticide use practices, and the appropriate interventions for these cases include enhanced education and enforcement [16]. In Germany, on average, almost 200 cases of pesticide poisoning are hospitalized annually, and approximately 5% of such poisoning cases are reported to be fatal [17]. Most likely such figures are significantly higher in developing countries with less efficient health systems. Data from monitoring systems focused on pesticides and other chemicals in food will in the future also be used for risk assessment of combined exposure to multiple chemicals (“chemical mixtures”). Typically, risk assessment of multiple chemicals is conducted using a tiered approach for exposure assessment, hazard assessment and risk characterization [18], an approach that clearly needs updating as new data from animal experiments show the potential for additive effects of such chemicals [19].

One Health formulates both the need for, and the benefit of, cross-sectoral collaboration. Here we will focus on the human health risk related to hazards present in plants grown for food, in food animals and in food derived from these animals, typically transmitted to humans through food. Some diseases have global epidemic – or pandemic – potential, resulting in dramatic action from international organizations and national agricultural and health authorities in most countries, as was the case with avian influenza. Other diseases relate to the industrialized food production chain and have been – in some settings – dealt with efficiently through farm-to-fork preventive action in the animal sector, e.g. Salmonella, or in the plant production sector, e.g. DDT.

2.3 International, Regional and National Examples of Food Safety Surveillance and Risk Assessment

This section will list a number of – in no way fully representative – examples of existing food safety systems with a focus on national and regional surveillance systems providing data for food contamination and foodborne disease, resulting from microbiological as well as chemical hazards. The section will also briefly describe examples of the now accepted methodology for science-based decision support based on such data: risk assessment, within the risk analysis framework, initially defined by WHO and FAO [20].

The examples described here are in no way intended to provide a full picture of developments in this area; however, they do represent some of the novel developments that have contributed to a significant revision of food control, food safety and foodborne disease prevention over recent decades.

2.3.1 FAO/WHO Food Safety Expert Bodies

FAO and WHO work together to provide scientific guidance on chemical as well as microbiological hazards and the human health risks they cause. The first FAO/WHO Expert Committee was the Joint Expert Committee on Food Additives (JECFA), created in 1955 to study the impact of food additives, including veterinary drugs, chemicals and toxins, on human health. As an independent group, JECFA advises the FAO/WHO Codex Alimentarius Commission and other Codex bodies on current and emerging issues in this area. In 1963 an additional group dealing with chemical safety assessment was created: the Joint FAO/WHO Meeting on Pesticide Residues (JMPR), advising the Codex Alimentarius Commission on maximum residue levels for pesticides and environmental contaminants in food products. More recently, the problems of microbiological contaminants in food have resulted in the creation of the Joint FAO/WHO Expert Meetings on Microbiological Risk Assessment (JEMRA). This body has, since 2000 and in collaboration with the Codex Committee on Food Hygiene, initiated risk assessment work on a number of important foodborne pathogens (e.g. Salmonella, Campylobacter, Listeria and Vibrio). All three expert bodies operate under the generic Risk Analysis framework, focusing on a formalized and standardized risk assessment process (see Fig. 1).

Fig. 1 Components of a microbiological Risk Assessment: hazard identification leads to hazard characterization and exposure assessment, each of which further leads to risk characterization
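
The logic of Fig. 1 can be illustrated with a minimal sketch in Python, mapping the four components onto a toy calculation; the hazard/food pair, the Beta-Poisson dose-response parameters and the exposure values are illustrative assumptions and are not taken from any JECFA/JMPR/JEMRA assessment.

```python
# Minimal sketch mapping the four components of Fig. 1 onto a toy calculation.
# All parameter values are illustrative assumptions, not taken from any
# JECFA/JMPR/JEMRA assessment.

# 1. Hazard identification: the pathogen/food pair of concern.
hazard = ("Salmonella", "table eggs")

# 2. Hazard characterization: an approximate Beta-Poisson dose-response model.
ALPHA, BETA = 0.13, 51.0  # assumed dose-response parameters

def p_illness(dose_cfu: float) -> float:
    """Probability of illness after ingesting a given dose (cfu)."""
    return 1.0 - (1.0 + dose_cfu / BETA) ** (-ALPHA)

# 3. Exposure assessment: assumed prevalence and dose in a contaminated serving.
prevalence = 0.005  # fraction of servings contaminated
dose_cfu = 100.0    # cfu per contaminated serving

# 4. Risk characterization: combine the exposure estimate with the dose-response model.
risk_per_serving = prevalence * p_illness(dose_cfu)
print(f"{hazard[0]} in {hazard[1]}: estimated risk per serving = {risk_per_serving:.1e}")
```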

The Codex Alimentarius Commission (Codex), a subsidiary body of FAO and WHO, is one of the most important and successful multilateral institutional mechanisms for regulatory harmonization and standards cooperation in the global system. In many ways, the success of Codex as the multilateral standard-setting mechanism for food safety is the result of well-defined normative agreements and well-segmented, sustained work on strategic market and regulatory policy issues in the global food system. It is likely that the clear definition – and separation – of the scientific advice provided by the FAO/WHO Expert groups and the management decisions suggested by the Codex Committees has contributed significantly to this success. Through the long-term focus on the social, economic and scientific aspects of food safety regulation, the institutional legitimacy of the Codex Alimentarius Commission has grown as globalization of the agri-food industries and food systems accelerates. Consequently, the ability of FAO, WHO and Codex to effectively mobilize national governments, industry and civil society in support of food safety regulatory standards harmonization reinforces the need for increased multilateral cooperation in this area.

2.3.2 WHO/FAO International Food Safety Authorities Network (INFOSAN)

It has been recognized for some years that the increased globalization of food trade also increases the risk of contaminated food spreading quickly around the globe. In 2004 WHO created the INFOSAN network to enable WHO to assist Member States in managing food safety risks and to ensure rapid sharing of information during food safety emergencies; INFOSAN later became a joint network with FAO. INFOSAN also facilitates the sharing of experiences and tested solutions in and between countries in order to optimize future interventions to protect the health of consumers. National authorities of 186 Member States are part of the network [21].

INFOSAN Member States typically have an Emergency Contact Point and several Focal Points. Members are expected to respond to requests for information and to take the initiative to share and disseminate food safety information of potential international relevance. In the 2016/17 biennium INFOSAN was operational during 84 food safety events. The level of engagement by the INFOSAN Secretariat relates to the countries involved, the severity of the public health impact, and the duration of the event. In many cases, the INFOSAN Secretariat will request information from INFOSAN Emergency Contact Points following the receipt of information about a food safety event of potential international concern. During complex events involving multiple countries, the INFOSAN Secretariat actively obtains and disseminates information to and from INFOSAN members regarding food safety events of international concern. INFOSAN is considered to function under the umbrella of the WHO International Health Regulations (IHR), which stipulate that any country experiencing a ‘public health emergency of international concern’ (PHEIC) must inform WHO – and thereby the world – about this event. INFOSAN does not have regulatory oversight, as the nature of the United Nations system precludes such action.

The total number of events treated under INFOSAN pales in comparison with systems that are part of food legislation (e.g. the EU Rapid Alert System for Food and Feed: RASFF). Nevertheless, the nature of events reported under INFOSAN can give us an idea of global trends. Table 1 lists the events recorded per hazard type (biological, chemical etc.), clearly showing that biological (microbiological) events occur most frequently. Table 2 describes events caused by (micro)biological hazards, with Salmonella and Listeria – as expected – causing the highest number of internationally important events. It should be noted that the recognition of events under the INFOSAN Network can in no way be said to give a scientifically valid estimation of the public health importance of different microbiological (or chemical) hazards.

Table 1 International food safety events acted upon by INFOSAN by hazard category, 2013–2017 [22]
Table 2 International food safety events, acted upon by INFOSAN, involving biological hazards, 2013–2017 [22]

2.3.3 European Union

Routinely reported data on food contamination events in all EU Member States are available to Member States through the EU Rapid Alert System for Food and Feed (RASFF), via the database maintained by the European Commission (EC) [23]. This source of information, largely based on surveillance and inspection programs driven by food safety contamination events reported by Member States, strongly depends on the nature of national monitoring and control programs. The filtering of notifications to be sent to the RASFF at a national level is only partially standardized, and an unknown proportion of food incidents occurring at a national level never enter the system [24]. It is evident that the very high number of alerts recorded (see Table 3 and Fig. 2, [23]) means that many events will not be treated further by most Member States in any major way. However, the system – and the relatively new set-up which enables open sharing of data outside regulatory agencies (the RASFF Consumer Portal) – clearly contributes to transparency and open risk communication across borders. It should be noted that other regions are trying to set up mirror RASFF systems; notably, there is now an ASEAN-RASFF system open for alert sharing in the ASEAN region – although not yet effective (http://arasff.net/). A new plan of action for A-RASFF was adopted in October 2018 [25].

Table 3 Evolution of the number of notifications by notification classification (original notifications and follow-up), 2011–2017
Fig. 2 Evolution of the number of notifications by notification classification (original and follow-up notifications), 2011–2017. Grouped bar graph of RASFF notifications for alerts, border rejections, information for attention and information for follow-up; follow-up alerts show the highest counts and original information for follow-up the lowest. (Data from RASFF: EU Rapid Alert System for Food and Feed [23])

All Member States within the European Union (EU) are obliged to collect data on the occurrence of zoonoses, zoonotic agents, antimicrobial resistance, animal populations and foodborne outbreaks, according to Directive 2003/99/EC. These reports enable evaluation of trends and sources of zoonotic agents, antimicrobial resistance and foodborne outbreaks within the EU [26]. It is noteworthy that these reports have been effective in directing Member State efforts in the area; for example, specific efforts to mitigate Salmonella risk in EU countries seem to have been effective, as documented in an almost 50% reduction in human Salmonella cases in the EU over a short 5-year period (2004–2009) [27]. At the same time, the prevalence of Salmonella in poultry decreased significantly, especially in laying hen flocks; this reduction is likely to be the main reason for the decline of Salmonella cases in humans, since eggs are considered the most important source of human infections in the EU. Notably, some EU Member States have even succeeded in eradicating Salmonella in egg-laying hens and thereby in nationally produced eggs for the market. The most convincing documentation for this comes from Denmark, where in 2015 a record low number of foodborne Salmonella cases was registered, with no cases attributed to Danish eggs for the first time in the almost 30-year history of the Salmonella source account in that country [28].

The European food safety system underwent a very dramatic revision following several food scandals in the late 1990s, most notably the ‘Mad Cow Disease’ scandal. Regulation (EC) No 178/2002 laid down the general principles and requirements of European food law, established the European Food Safety Authority (EFSA) and laid down procedures in matters of food safety for all Member States. It is noteworthy that while food safety provisions fall under EU authority, health issues typically fall under Member State authority. This sometimes causes problems at EU level when food (EFSA) and health (ECDC) data are collated. It should be said, however, that the collaboration between EFSA and ECDC has improved markedly in recent years.

EFSA – in spite of its name – is not an Authority in the typical sense of that word; the regulatory entity in the EU system is the EU Commission. EFSA’s responsibility is risk assessment and risk communication, which naturally includes overseeing monitoring and surveillance systems in collaboration with Member States. EFSA has therefore established a number of Expert Panels. As an example of the type of work performed by such panels, the EFSA Panel on Biological Hazards (BIOHAZ) has published two EU-wide farm-to-fork quantitative microbiological risk assessments (QMRA), on Salmonella in slaughter and breeder pigs and on Campylobacter in broilers. The Scientific Opinion on a QMRA of Salmonella in pigs represented a major step forward in terms of modelling from farm to consumption, as it took into account the variability between and within EU Member States. This QMRA model was developed to estimate the prevalence of infection and contamination and the microbial load from the farm to the point of consumption (exposure), and then to estimate the probability of infection. It was also used to investigate the effect of interventions to control Salmonella in pigs at different points of the food chain, and resulted in a hierarchy of suggested on-farm and slaughterhouse control measures, with estimates of the reduction in human cases they would achieve [29]. To model the effect of farm-to-fork interventions on the incidence of human campylobacteriosis, a QMRA model was developed for Campylobacter in broiler meat. Reductions in the public health risk of campylobacteriosis could be achieved through a variety of interventions, both in primary production and at the slaughterhouse, with different impacts. Reductions of public health risk using targets at primary production or microbiological criteria were also estimated through additional modelling.

In general, QMRA of foodborne pathogens at the European level has proven a useful and efficient tool enabling risk managers to evaluate the feasibility and the cost-benefit ratio of introducing control measures and targets to further protect the health of consumers [29].
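
A minimal sketch can illustrate the farm-to-fork logic behind such intervention modelling: contamination is propagated through chain stages and the risk from a baseline chain is compared with a chain including an extra decontamination step. The stage effects and dose-response parameter below are invented for illustration and do not reproduce the EFSA models, which work with full probability distributions and variability between Member States.

```python
# Minimal farm-to-fork sketch: propagate a contamination level through chain
# stages and compare baseline vs. an assumed slaughterhouse intervention.
# All stage effects (in log10 units) and the dose-response slope are invented.

import math

def risk_per_serving(log10_cfu_per_serving, r=1e-4):
    """Exponential dose-response: probability of illness from one serving."""
    return 1.0 - math.exp(-r * 10 ** log10_cfu_per_serving)

# Changes in contamination (log10 cfu per serving) at each stage: farm level,
# transport/lairage, slaughter and processing, retail/storage, cooking.
baseline_stages = [3.0, +0.5, -1.0, -0.5, -2.0]
intervention_stages = [3.0, +0.5, -2.0, -0.5, -2.0]  # extra 1-log decontamination step

baseline_risk = risk_per_serving(sum(baseline_stages))
intervention_risk = risk_per_serving(sum(intervention_stages))

print(f"Baseline risk per serving:     {baseline_risk:.2e}")
print(f"Intervention risk per serving: {intervention_risk:.2e}")
print(f"Relative reduction in risk:    {1 - intervention_risk / baseline_risk:.0%}")
```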

Since its creation in 2002, EFSA has produced risk assessments for more than 4000 substances in over 1600 scientific opinions, statements and conclusions through the work of its scientists. For individual substances, a summary of human health and – depending on the relevant legislation and intended uses – animal health and ecological hazard assessments has been collected and structured into EFSA’s chemical hazards database, OpenFoodTox [30]. This database provides open-source data for substance characterization and links to EFSA’s related outputs, background European legislation, and a summary of the critical toxicological endpoints and reference values. OpenFoodTox is a tool and source of information for scientific advisory bodies and stakeholders with an interest in chemical risk assessment. Summary data sheets for individual substances can be downloaded.

2.3.4 USA

The US Food and Drug Administration (FDA) uses risk analysis, a concept and framework fostered by the WHO and the FAO in the mid-1990s, to ensure that regulatory decisions about foods are science-based and transparent. The FDA’s Center for Food Safety and Applied Nutrition (CFSAN) applies the concept of risk analysis using tools aimed at presenting new possibilities for detecting and mitigating risks to the food supply. For example, CFSAN and NASA (the National Aeronautics and Space Administration) are conducting a pilot project that uses geospatial analysis to recognize patterns of contamination in crops, forecasting high potential for contamination events in specific regions, at specific times and under various weather conditions [31].

The US food safety system in general represents a case of shared government responsibilities. Food safety and quality in the United States is governed by 30 federal laws and regulations administered by 15 federal agencies. The three main agencies are the Food and Drug Administration (FDA) and the US Department of Agriculture (USDA), which divide food and food production between them according to food groups, and the Centers for Disease Control and Prevention (CDC), mainly responsible for investigating localized and nationwide outbreaks of foodborne illnesses. In many cases, the food safety functions of the FDA and USDA overlap, particularly in inspection/enforcement, training, research and rulemaking, for both domestic and imported food. Nevertheless, the system presents a generally successful example of how integration (and shared use) of data can be possible across sectors and governmental entities.

The USA was one of the first – if not the first – countries to implement (in 1996) an active surveillance system for foodborne diseases. The Foodborne Diseases Active Surveillance Network (FoodNet) tracks foodborne illnesses, generating information used to guide and monitor food safety policy and prevention efforts. FoodNet estimates the number of foodborne illnesses, monitors changes in the incidence of specific illnesses over time, and attributes illnesses to specific sources and settings. The system functions as a collaborative program of the CDC, 10 state health departments, the USDA and the FDA. FoodNet conducts population-based active surveillance for laboratory-confirmed infections caused by seven bacterial pathogens (Campylobacter, Listeria monocytogenes, Salmonella, Shiga toxin-producing Escherichia coli [STEC], Shigella, Vibrio, and Yersinia), two parasitic pathogens (Cyclospora and Cryptosporidium), and one syndrome: hemolytic uremic syndrome (HUS), typically caused by STEC. The FoodNet surveillance area includes approximately 15% of the population of the United States of America [32].
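
The basic arithmetic behind such active surveillance estimates can be sketched as follows; all case counts below are hypothetical, and real FoodNet burden estimates additionally correct for under-diagnosis and under-reporting.

```python
# Minimal sketch of how active-surveillance counts can be turned into incidence
# estimates. All case counts below are illustrative assumptions, not FoodNet data.

US_POPULATION = 330_000_000          # assumed total population
CATCHMENT_FRACTION = 0.15            # FoodNet covers ~15% of the US population
catchment_population = US_POPULATION * CATCHMENT_FRACTION

lab_confirmed_cases = {"Salmonella": 8_000, "Campylobacter": 9_500}  # hypothetical counts

for pathogen, cases in lab_confirmed_cases.items():
    incidence_per_100k = cases / catchment_population * 100_000
    # Naive national extrapolation (ignores under-diagnosis and under-reporting,
    # which FoodNet-based burden estimates correct for with multipliers)
    national_estimate = cases / CATCHMENT_FRACTION
    print(f"{pathogen}: {incidence_per_100k:.1f} per 100,000; "
          f"crude national estimate {national_estimate:,.0f} lab-confirmed cases")
```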

In an effort to re-conceptualize the US strategic food safety system, it has been realized that exchanges of knowledge and information about foodborne hazards, facilitated by new communication technologies, could drive improved coordination and more efficient regulatory intervention. However, across the farm-to-table spectrum, many critical points are beyond the reach of rules and standards [33]. The dominant logic of traditional approaches, using control rather than management, may result in less than desirable outcomes. The US system can be said to be one of the primary national regulatory systems promoting a farm-to-table, science-based management framework, dating all the way back to its original description in the ‘Clinton Farm-to-Table Plan’ [34].

The US system has been shown to be able to detect and respond to new developments in the food safety landscape. An increasing number of microbial foodborne illnesses are associated with fresh fruits and vegetables; an analysis of foodborne outbreaks in the USA found that 12% of outbreaks and 20% of outbreak-related illnesses were associated with produce [35]. A modern risk-based food safety system takes a farm-to-fork preventative approach and relies on the proactive collection and analysis of data to better understand potential hazards and risk factors, to design and evaluate interventions, and to prioritize prevention efforts. Such a system focuses resources at the points in the food system where they are likely to have the greatest benefit to public health [36].

PulseNet USA, a national molecular subtyping network for foodborne disease surveillance, was initiated in the United States in 1996 as a critical early warning system for foodborne disease outbreaks. The system was based on a then-revolutionary typing methodology, PFGE (Pulsed-Field Gel Electrophoresis), enabling rapid genomic comparison between isolates from human cases and from food or animal sources. The PulseNet network is now being replicated in different ways in Canada, Europe, the Asia Pacific region, and Latin America [37]. These independent networks work together in PulseNet International, allowing public health officials and laboratorians to share molecular epidemiologic information in real time and enabling rapid recognition and investigation of multi-national foodborne disease outbreaks.

A new PulseNet International vision is focused on the standardized use of whole genome sequencing (WGS) to identify and subtype foodborne bacterial pathogens worldwide, replacing traditional methods. Focused on real-time surveillance, such standardized subtyping will deliver sufficiently high resolution and epidemiological concordance. Ideally, WGS data collected for surveillance purposes should be publicly available in real time, not only for disease surveillance and outbreak purposes but also to answer scientific questions pertaining to source attribution, antimicrobial resistance, transmission patterns, etc. [38].
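
A minimal sketch of the kind of comparison standardized WGS subtyping enables is given below, assuming isolates have already been reduced to cgMLST-style allele profiles; the profiles and clustering threshold are invented for illustration, as real schemes use thousands of loci and validated cut-offs.

```python
# Minimal sketch of cluster detection from WGS-derived allele profiles
# (cgMLST-style). Profiles and the clustering threshold are invented for
# illustration; real schemes have thousands of loci and validated cut-offs.

from itertools import combinations

profiles = {
    "isolate_A": [1, 4, 2, 7, 3],
    "isolate_B": [1, 4, 2, 9, 3],   # differs at one locus from isolate_A
    "isolate_C": [5, 1, 8, 2, 6],
}

def allele_distance(p, q):
    """Number of loci at which two allele profiles differ."""
    return sum(1 for a, b in zip(p, q) if a != b)

CLUSTER_THRESHOLD = 2  # assumed maximum allele distance for a putative cluster

for (name1, p1), (name2, p2) in combinations(profiles.items(), 2):
    d = allele_distance(p1, p2)
    flag = "possible cluster" if d <= CLUSTER_THRESHOLD else "unrelated"
    print(f"{name1} vs {name2}: distance {d} -> {flag}")
```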

2.3.5 Denmark

The national system for food safety in Denmark has been organized (since 2007) with a clear separation between risk assessment (hosted in a university institute) and risk management (hosted by the governmental food safety regulator). Thus, the administrative responsibilities (rules, control etc.) lie with the Danish Veterinary and Food Administration, while the National Food Institute, Technical University of Denmark, is responsible for the scientific assessment of risks and the research-based assessment of monitoring data. This separation enables independent scientific description of problems and possible solutions, offering a transparent and seemingly efficient system. A cornerstone in providing research-based scientific advice is to have people involved who actually do research in relevant areas, i.e. university scientists. The National Food Institute conducts research in microbiological and chemical risk assessment, but also in food production and nutrition. The Institute thus adopts a holistic approach to food, including knowledge about production forms as well as the positive and negative aspects of our food. The basic research conducted by the National Food Institute is recognized internationally, and the Institute operates a number of EU reference laboratories as well as WHO collaborating centers [39].

The Danish Zoonosis Centre was created in 1995 to combine data on zoonotic pathogens from the animal, food and health sectors. It is therefore the first example of a ‘One Health’ surveillance system – created before the term was actually invented (in 2008). The Centre publishes annual reports enabling science-based policy decisions in this area, and similar zoonosis reports are now produced in a number of other European countries. The Danish Zoonosis Report 2017 shows that Campylobacter is the most common cause of foodborne illness in Denmark; using integrated data, the report shows that cattle may be a source of Campylobacter infection, leading to changes in the new Danish Action Plan against Campylobacter 2018–2021 [40]. In 2017, the Salmonella source account – which links the number of human Salmonella infections to specific food items and animal reservoirs by modelling the distribution of serovars – was for the first time based on results from WGS. Domestic and imported pork were estimated to be the sources most commonly associated with human salmonellosis. Burden of disease studies can be used to compare the severity of foodborne pathogens, for instance showing that even though the number of listeriosis cases is lower than that of e.g. salmonellosis, the burden of disease is high due to the serious nature of the disease (12 deaths from listeriosis were reported in Denmark in 2017). The burden of disease study on norovirus estimated approximately 185,060 cases of norovirus infection and 26 deaths in Denmark in 2017 [40].

The research-based risk assessment conducted by the National Food Institute can be divided into chemical and microbiological risk assessment, with the chemical part covering both population exposure estimation and an assessment of potential effects in humans. Risk assessment is the scientific part of risk analysis, which consists of two further elements: risk management and risk communication. Risk assessment includes hazard identification, hazard characterization and exposure assessment, and based on these the risk is characterized (see Fig. 1).

Risk assessments of chemicals are generally based on a comparison of human exposure with a NOAEL (No Observed Adverse Effect Level) for the chemical, i.e. the highest dose of the chemical causing no adverse effects in laboratory animals. This is done for one chemical at a time. However, humans are exposed to many different chemicals on a daily basis. In vitro studies and studies in experimental animals show that, for example for endocrine-disrupting chemicals, combined exposure to several chemicals can induce effects even though the doses of the individual chemicals are at or below their NOAELs. This implies that risk assessments of single chemicals in isolation most likely underestimate the combined risk for humans. New knowledge related to risk assessment of chemical cocktails in food suggests that we need additional data to elucidate combined exposure to chemicals [41].
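
The reasoning can be illustrated with a minimal hazard quotient/hazard index sketch under an assumed dose-addition model; the substance names, NOAELs, safety factor and exposure estimates are illustrative assumptions only.

```python
# Illustrative hazard quotient / hazard index calculation under dose addition.
# All names and numbers are assumptions for the sake of the example.

SAFETY_FACTOR = 100  # conventional default uncertainty factor (10 x 10)

chemicals = {
    # name: (NOAEL in mg/kg bw/day from animal studies, estimated human exposure in mg/kg bw/day)
    "substance_A": (10.0, 0.03),
    "substance_B": (5.0, 0.02),
    "substance_C": (20.0, 0.09),
}

hazard_index = 0.0
for name, (noael, exposure) in chemicals.items():
    reference_dose = noael / SAFETY_FACTOR       # crude health-based guidance value
    hazard_quotient = exposure / reference_dose  # <1 means exposure is below the guidance value
    hazard_index += hazard_quotient
    print(f"{name}: hazard quotient = {hazard_quotient:.2f}")

# Each substance alone may look acceptable (HQ < 1), while the combined hazard
# index for chemicals assumed to act on the same endpoint exceeds 1.
print(f"Combined hazard index = {hazard_index:.2f}")
```

In this invented example each substance stays below its guidance value on its own, while the combined hazard index exceeds 1 – which is the sense in which single-chemical assessments may underestimate combined risk.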

2.3.6 Canada

The Canadian Food Inspection Agency (CFIA) uses ‘Ranked Risk Assessment’ (RRA) to prioritize chemical hazards for inclusion in monitoring programmes or method development projects based on their relative risk. The relative risk is calculated for a chemical by scoring toxicity and exposure in the ‘risk model scoring system’ of the Risk Priority Compound List (RPCL). The ranking may be refined by the data generated by the sampling and testing programs. The two principal sampling and testing programmes are the National Chemical Residue Monitoring Program (NCRMP) and the Food Safety Action Plan (FSAP). The NCRMP sampling plans focus on the analysis of products for residues of veterinary drugs, pesticides, environmental contaminants, mycotoxins, and metals. FSAP surveys focus on emerging chemical hazards associated with specific foods or geographical regions for which applicable maximum residue limits (MRLs) are not set. Follow-up actions vary according to the magnitude of the health risk, all with the objective of preventing any repeat occurrence and minimizing consumer exposure to a product representing a potential risk to human health [42].
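
The general principle of ranking chemicals by combined toxicity and exposure scores can be sketched as follows; the scoring scale and values are invented and do not reproduce the actual RPCL ‘risk model scoring system’.

```python
# Generic illustration of ranking chemical hazards by a combined
# toxicity x exposure score. The scoring scale and values are invented;
# they do not reproduce CFIA's actual RPCL scoring system.

hazards = [
    # (name, toxicity score 1-5, exposure score 1-5)
    ("pesticide_X", 4, 2),
    ("mycotoxin_Y", 5, 3),
    ("veterinary_drug_Z", 2, 4),
]

# Higher combined score -> higher priority for monitoring or method development
ranked = sorted(hazards, key=lambda h: h[1] * h[2], reverse=True)

for name, tox, exp in ranked:
    print(f"{name}: relative risk score = {tox * exp}")
```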

2.3.7 Australia and New Zealand

Food Standards Australia New Zealand (FSANZ) is an independent statutory agency established by the Food Standards Australia New Zealand Act 1991 (FSANZ Act). FSANZ is part of the Australian Government’s Health portfolio. FSANZ, along with other government agencies in Australia and New Zealand, monitors the food supply to ensure it is safe. FSANZ routinely conducts targeted surveys and Total Diet Studies to collect analytical data on the levels of chemicals, microbiological contaminants and nutrients in food.

The Communicable Diseases Network Australia and OzFoodNet monitor incidents and outbreaks of foodborne disease, which can lead to the detection of an unsafe food product or unsafe food practice [43]. Microbial contamination may take place at the pre-farming, farming or post-farming stages of the food supply chain. Campylobacter, Salmonella, Listeria monocytogenes, Escherichia coli O157:H7 and non-O157 STEC are the most common pathogenic bacteria associated with food safety issues in the food supply chain [44]. Efficient process controls and effective food safety management systems are vital elements to reduce microbial contamination and improve food security.

2.3.8 The Netherlands

The Dutch National Institute for Public Health and the Environment (RIVM) collects and collates knowledge and information from various sources, both national and international, placing it at the disposal of policy-makers, researchers, regulatory authorities and the general public. Each year, RIVM produces numerous reports on all aspects of public health, nutrition and diet, health care, disaster management, nature and the environment. The RIVM covers three domains with specific knowledge and expertise: Infectious Diseases and Vaccinology (Centre for Infectious Disease Control), Environment and Safety (including environmental incident service), Public Health and Health Services (including food and food safety) [45].

Microorganisms may enter the food chain, for instance during production or during home preparation. Foods may also contain chemical contaminants, some of which can be harmful to health. RIVM develops models to determine food safety and maintains databases of relevant information – for example, determining the concentration at which a chemical substance will pose a risk to health and how much of that substance a person can safely ingest. In the field of microbial food safety, RIVM has developed, together with international partners, a risk assessment tool called Quantitative Microbiological Risk Assessment (QMRA). This tool contains food chain models (‘farm-to-fork’) in which the prevalence and number of pathogens are followed. The Dutch government, together with other national authorities in Europe, is responsible for establishing, monitoring and enforcing laws and regulations to that end. RIVM advises the government in these matters, at the national and international levels. The Netherlands Food and Consumer Product Safety Authority (NVWA) is responsible for supervision and enforcement in the Netherlands.

Dutch research is increasingly being undertaken in an international context for organisations such as EFSA and WHO/FAO. RIVM also researches food allergens, seeking to identify substances which cause an allergic reaction and the quantity of the substance which is likely to do so. Based on research findings, RIVM advises various clients. They include the Ministry of Health, Welfare and Sport (VWS) and other Ministries, the Netherlands Food and Consumer Product Safety Authority (NVWA), the Board for the Authorisation of Plant Protection Products and Biocides (Ctgb), the Veterinary Medicinal Products Unit (BD), the European Food Safety Authority (EFSA), WHO and FAO [45]. RIVM hosts the World Health Organization Collaborating Centres on Chemical Food Safety and on Risk Assessment of Pathogens in Food and Water.

Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. This is mostly rooted in deterministic risk assessment models, which have been used for chemicals in food for more than 50 years. In addition, cumulative health effects of similar pesticides are often not taken into account in these assessments. The European research project ACROPOLIS has attempted to develop stochastic modelling in this area, something that has been done for microbiological risk assessments for more than 20 years [46]. These models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at http://mcra.rivm.nl/. The emphasis is on cumulative assessments, presenting two contrasting approaches: sample-based and compound-based. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons, and, not surprisingly, additional data on the agricultural use of pesticides may give more realistic risk assessments. The program is an independent research tool from the Dutch government, developed by Wageningen University in close cooperation with RIVM.
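
The stochastic (Monte Carlo) approach to cumulative dietary exposure can be illustrated with a minimal sketch; the residue and consumption distributions, relative potency factors and body weight below are invented, and the sketch does not reproduce the MCRA software.

```python
# Minimal Monte Carlo sketch of cumulative dietary exposure to two pesticides
# assumed to share a mode of action. Distributions, relative potency factors
# and the body weight are invented; this does not reproduce the MCRA software.

import random

random.seed(1)

N_ITER = 10_000
BODY_WEIGHT_KG = 70.0

# Relative potency factors expressing each compound in "index compound equivalents"
RPF = {"compound_1": 1.0, "compound_2": 0.3}

def daily_exposure_mg_per_kg():
    """One simulated person-day of exposure (mg index-compound eq. per kg bw)."""
    consumption_kg = random.lognormvariate(-1.5, 0.5)         # food eaten that day
    total = 0.0
    for compound, rpf in RPF.items():
        residue_mg_per_kg = random.lognormvariate(-3.0, 1.0)  # residue in the food
        total += consumption_kg * residue_mg_per_kg * rpf
    return total / BODY_WEIGHT_KG

exposures = sorted(daily_exposure_mg_per_kg() for _ in range(N_ITER))
p99_9 = exposures[int(0.999 * N_ITER) - 1]
print(f"Simulated 99.9th percentile of acute cumulative exposure: {p99_9:.2e} mg/kg bw/day")
```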

RIKILT-Institute of Food Safety is an independent non-profit institute conducting research on the detection and identification of contaminants in food and feed. The institute contributes to the monitoring of production chains, the quality of agricultural products, and the knowledge of health-protecting substances in food. It carries out legislative and policy-supporting tasks for the Dutch government and international bodies, including the European Commission and EFSA [47].

TNO (Netherlands Organisation for Applied Scientific Research) provides independent advice on safety assessment (including food safety) and risk management. It develops methodology that enables manufacturers and public-sector bodies to quickly and accurately assess the microbiological and toxicological safety of complex products, as well as methods for predicting the allergenicity of proteins and peptides and instruments for the early detection of public health risks and potential food incidents. TNO is working on internationally recognised testing methods that speed up product and policy development and enable more decisive responses to potential food incidents. TNO has also been working for more than 20 years on investigating the effect of different foodstuffs on the enteric environment, and thereby on health, using in vitro gut models [48].

2.3.9 China

Like most other major economies, China has been updating and changing its food safety regulatory system in major ways over the last 20 years. Food production in China is now one of the major drivers of economic development, and the identification of food safety as a national priority, combined with a number of major food safety scandals, has driven the modernization of the food safety legislative framework. One of the important new developments has been the creation of China’s National Center for Food Safety Risk Assessment (CFSA). The National Food Safety Standards (NFSS) framework was established, benchmarked on international best practices and on the guidance of the Codex Alimentarius Commission (CAC), with a clear direction to base food safety standard-setting on risk analysis principles and in particular on risk assessments based on Chinese data [49].

An important component of the food safety risk analysis framework in China is monitoring and surveillance, based on the new Food Safety Law of the People’s Republic of China of 2009. At present, the system comprises four networks plus dietary exposure monitoring. The four networks are the foodborne disease surveillance network, the network monitoring biological hazards (bacteria, viruses and parasites) in foods, the network monitoring chemical hazards in foods, and the microbial PFGE profile network. The system now covers all 31 provinces, major municipalities and autonomous regions in Mainland China and carries out national food and exposure monitoring as well as foodborne disease surveillance and investigation. The China National Center for Food Safety Risk Assessment has been assigned overall responsibility for foodborne disease surveillance and dietary exposure monitoring through periodic national Total Diet Studies [50].

The Chinese microbiological food safety surveillance system collects data regarding food contamination by foodborne microorganisms, providing relevant data for food safety supervision, risk assessment, and standard-setting [51].

2.3.10 Lebanon

While China is still formally considered a developing country, it should be realized that a significant number of other developing countries are in a situation where regulatory food safety capacity is only now being built. Lebanon is but one such country. A risk-based food safety and quality governance system, based on international guidance, is presently being developed in Lebanon [52].

The new Lebanese food safety law (2016) will result in the creation of a Lebanese food safety authority (LFSA), which will develop a food safety governance system in Lebanon in accordance with the FAO/WHO risk analysis framework and the World Trade Organization (WTO) Sanitary and Phytosanitary (SPS) Agreement. Lebanese officials have used experience from the regulatory and institutional food safety governance systems developed in the USA, the EU, Canada and France as relevant models. There is a recognized need to strengthen Lebanese infrastructure capacity at the institutional and stakeholder levels through harmonization of Risk Assessment (RA) and Risk Management (RM) processes. It was recognized that food safety systems in the model countries listed did not always correspond to the scientific approach in which RM and RA should be functionally and institutionally separated [52].

3 AMR in the Context of Food Safety Surveillance and Risk Mitigation

The inventor of the way we still identify microorganisms, Louis Pasteur, stated in 1878: “It is a terrifying thought that life is at the mercy of the multiplication of these minute bodies; it is a consoling hope that science will not always remain powerless before such enemies.” And indeed science did provide fantastic solutions to combat microorganisms: we now have antimicrobials with the ability to kill even some of the most dangerous of these “minute bodies”. But unfortunately, microorganisms have also found ways to fight back. In 2013 the Chief Medical Officer for England, Sally C. Davies, wrote in her book ‘The Drugs Don’t Work’: “We are now at a crossroads … as our use of these valuable drugs is not only becoming threatened by the spectre of resistance among the bugs they are used to treat, but also as we recognise that their injudicious use can cause harm in its own right”.

The spread of antimicrobial-resistant bacteria poses a major threat not only to our ability to treat and prevent specific diseases, but also to our ability to provide medical care across a range of emergency events. Therefore, the occurrence of, and rapid increase in, antimicrobial resistance (AMR) in microorganisms, including human and animal pathogens, should be considered a significant health security threat. That the need for action to contain this global threat is both immediate and growing was recognized by global leaders meeting at the General Assembly of the United Nations, which in a 2016 declaration recognized resistance to antibiotics (antibiotics are antimicrobials produced by microorganisms) as the “greatest and most urgent global risk” [53].

Globally, more than half of all antimicrobials (up to 85%) are not used to treat humans but to support animal production. While the EU has banned the use of antimicrobials as growth promoters in animals, in all other parts of the world such use (i.e. use that is not linked to disease) likely constitutes at least half of all animal use, the rest relating to actual treatment (or prophylaxis) in animals. Any use of antimicrobials in animals can lead to the development of antimicrobial resistance (AMR) in their bacteria, and all animal bacteria can potentially end up in humans – mainly through our food but also through other routes, e.g. direct contact.

While the issue of the use of antimicrobials in animal food production systems has been acknowledged as a potentially serious problem for at least 20 years [54], more recent documentation of increased serious AMR stemming from animal production is now emerging, causing serious concern [55, 56]. Likewise it has been obvious for some time that irresponsible use of antimicrobials as animal growth promoters (AGP) was contributing to the problem [57], and that experience from different national attempts to control the problem could suggest directions towards successful mitigation [58]. One major regulatory milestone was the EU ban on the use of antimicrobials for animal growth promotion in 2006.

Especially worrisome is the emergence of resistance against antimicrobials that are considered critically important in human medicine, and the emergence of multidrug resistant (MDR) infections [59]. During recent decades the animal use of antimicrobials, particularly as AGPs, has led to alarming levels of AMR in many countries. Conflicts of interests, values and risks between agriculture, health and commercial stakeholders seem to have complicated the introduction of efficient interventions to mitigate this increasing risk [58]. In addition, the unintended economic incentive of veterinarians profiting from their own prescriptions has most likely stimulated antimicrobial use in animals in general; such incentives are now banned in all Scandinavian countries.

It should be recognized that the use of AGPs is not only relevant to land animals. It is well documented that the exposure of fish pathogens and aquatic bacteria to antimicrobials drives the development of drug resistance, and there seems to be a causal relationship between the use of specific antimicrobials in aquaculture and an increase in AMR prevalence [60]. Additionally, other studies suggest that AMR in aquaculture environments could contribute to the AMR of human pathogens [61].

3.1 Surveillance Systems for Antimicrobial Resistance and Antimicrobial Use

The need for antimicrobial resistance surveillance has been discussed internationally for at least 25 years. While the French system for surveillance of AMR in certain animal species was initiated as early as 1982, the first two national, integrated (animal/human) systems to be implemented were DANMAP (the Danish Programme for surveillance of antimicrobial consumption and resistance in bacteria from animals, food and humans) and the US NARMS (National Antimicrobial Resistance Monitoring System). As the titles reveal, only the Danish system included data on antimicrobial use (both veterinary and human). Much later (2017), experience from the collaborative efforts of the European Antimicrobial Resistance Surveillance System (EARSS) and the European Surveillance of Antimicrobial Consumption programme (ESAC) clearly demonstrated that integrated monitoring of resistance, use and costs can prove a crucial factor in driving political commitment to successful resistance containment campaigns.

3.1.1 WHO

The GLASS (Global Antimicrobial Resistance Surveillance System) data-sharing platform was initiated in 2015 following the adoption of the Global Action Plan on Antimicrobial Resistance by the 68th World Health Assembly that year. This reflects the global consensus that AMR poses a profound threat to human health and that enhanced global surveillance and research are needed to strengthen the evidence base and support AMR risk mitigation. GLASS was developed to facilitate and encourage a standardized approach to AMR surveillance globally, but unfortunately it is not integrated across disciplines and it does not include data on antimicrobial use in humans. The first GLASS plan suggests that at a later stage it will allow progressive incorporation of information from other surveillance systems related to AMR in humans, such as for foodborne AMR, as well as monitoring of antimicrobial use [62]. For some time – basically since 2000 – WHO has actually promoted integrated surveillance, at least for foodborne pathogens, and the WHO AGISAR group (Advisory Group on Integrated Surveillance of Antimicrobial Resistance) has produced significant guidance over the years to that effect [63].

3.1.2 EU

The European Antimicrobial Resistance Surveillance System (EARSS) was established in 1998 and in 2010 it was transferred to the European Centre for Disease Prevention and Control (ECDC) as the European Antimicrobial Resistance Surveillance Network (EARS-Net). It provides public access to descriptive data (maps, graphs and tables) through the ECDC Surveillance Atlas of Infectious Diseases, while more detailed analyses are presented in annual reports and scientific publications. The objectives of EARS-Net are to collect comparable, representative and accurate AMR data and to encourage the implementation, maintenance and improvement of national AMR surveillance programmes. It is noteworthy that, when comparing the EU/EEA countries, there is a very clear trend towards higher AMR prevalence in the south and lower AMR prevalence in the north, most likely reflecting differences in the efficiency of risk mitigation policies in these regions (see Fig. 3).

Fig. 3 Frequency distribution of Escherichia coli isolates completely susceptible and resistant to 1–11 antimicrobials in broilers, 30 EU/EEA Member States, 2016. (From ECDC/EFSA [56])

Fortunately, the European data are now also presented in an integrated report with input/data from ECDC, EFSA and EMA (European Medicines Agency) [64], which covers both antimicrobial resistance and antimicrobial use in food-producing animals and humans.

3.1.3 Denmark

In 1995, Denmark was the first country to establish an integrated, systematic and continuous monitoring programme of antimicrobial drug consumption and antimicrobial resistance in animals, food and humans: the Danish Integrated Antimicrobial Resistance Monitoring and Research Programme (DANMAP). Monitoring of antimicrobial drug resistance and a range of research activities related to DANMAP have contributed to restrictions or bans on the use of antimicrobial growth promoters (AGP) in food animals in Denmark and other European Union countries. In fact, Danish data were instrumental in driving EU policy and legislation towards the ban on the use of antimicrobial growth promoters in animal production [55].

In Denmark, DANMAP data and analyses have been used to promote sustainable animal production practices where high productivity is reached without inappropriate use of antimicrobials. The key elements here are good animal husbandry practices that prevent disease, combined with commercial disincentives for antimicrobial use and a legal framework that regulates the use of antimicrobials in the animal sector and removes the opportunity for veterinarians to make a profit from (prescribing and) selling antimicrobials. Indeed, the success story of Danish pig production is instructive here, with an annual production of 20 million pigs before the ban on AGPs and 30 million after the ban [57].

3.1.4 USA

The US National Antimicrobial Resistance Monitoring System for Enteric Bacteria (NARMS) was established in 1996. NARMS is a collaboration among state and local public health departments, CDC, the U.S. Food and Drug Administration (FDA), and the U.S. Department of Agriculture (USDA). NARMS uses an integrated “One Health” approach to monitor antimicrobial resistance in enteric bacteria from humans, retail meat, and food animals. NARMS data are not only essential for ensuring that antimicrobial drugs approved for food animals are used in ways that are safe for human health but they also help address broader food safety priorities.

NARMS surveillance, applied research studies, and outbreak isolate testing provide data on the emergence of drug-resistant enteric bacteria; genetic mechanisms underlying resistance; movement of bacterial populations among humans, food, and food animals; and sources and outcomes of resistant and susceptible infections. NARMS surveillance focuses on two major zoonotic bacterial causes of foodborne illness in the United States, nontyphoidal Salmonella and Campylobacter. Food animal and retail meat surveillance also include Enterococcus and Escherichia coli, common intestinal bacteria that can serve as reservoirs of resistance genes and indicators of selection pressures in Gram-positive and Gram-negative bacteria, respectively. In addition, CDC uses the NARMS human surveillance platform for monitoring resistance in E. coli O157, Vibrio, and the nonzoonotic enteric pathogens, Shigella and typhoidal Salmonella. NARMS data can be used to guide and evaluate the impact of science-based policies, regulatory actions, antimicrobial stewardship initiatives, and other public health efforts aimed at preserving drug effectiveness, improving patient outcomes, and preventing infections [65].

3.1.5 France

The French surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin (RESAPATH) was set up in 1982 under the name of RESABO (BO for bovines). In 2000, it was expanded to pigs and poultry and in 2007, to other animal species, and now resides under the French Agency for Food, Environmental and Occupational Health Safety (ANSES). The surveillance system estimates AMR in animal pathogens and is also part of a recent intersectorial “One Health” national action plan against antimicrobial resistance in humans, animals and the environment adopted in 2016.

Data from RESAPATH have documented a decline or stabilisation in resistance for the vast majority of antimicrobials tested in animal pathogens between 2006 and 2014, with the proportion of multi-resistant bacterial strains significantly reduced in all species. These results are consistent with the large reductions in the exposure of animals to antimicrobials in France in recent years. However, resistance levels seem to have increased slightly between 2014 and 2016 for several animal species and antimicrobials [66].

3.1.6 Germany

In Germany, data on the consumption of antimicrobials and the spread of antimicrobial resistance in human and veterinary medicine are recorded in the GERMAP report within the “One Health” approach, with input from a large number of federal institutions, including the BfR (Risk Assessment) and the Robert Koch Institute (Public Health) [67]. The National Reference Laboratory for Antibiotic Resistance within the BfR is tasked – under the framework of the Zoonoses Monitoring Directive (2003/99/EC) – with antimicrobial resistance testing, collaboration in the analysis of infection chains, molecular characterization of antimicrobial resistance determinants and the conduct of inter-laboratory studies. Regarding animal use of antimicrobials, the report documented that antimicrobial-resistant bacteria or resistance genes can be transferred between humans and animals and vice versa. It further suggests that, should the use of antimicrobials not ultimately be limited to the extent required for treatment and metaphylaxis, further legal interventions into the therapeutic freedom of veterinarians must be expected.

A novel, web-based surveillance system for hospital antimicrobial consumption has been developed in Germany, providing real-time surveillance at unit and facility levels and accessible to all relevant stakeholders. User-defined reports are available via an interactive database, enabling comparison of different antimicrobial use groups as defined by WHO, as well as comparison of proportional use with other countries [68].
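Hospital consumption comparisons of this kind are conventionally expressed as WHO-defined daily doses (DDD) per 100 patient-days. The sketch below is purely illustrative of that calculation; the drug list, DDD values and usage figures are examples and not data from the German system (the official values are maintained in the WHO ATC/DDD index).

# Illustrative calculation of antimicrobial consumption density as
# DDD per 100 patient-days (WHO ATC/DDD methodology).
# All figures below are hypothetical examples, not surveillance data.

WHO_DDD_GRAMS = {          # assumed defined daily dose per agent, in grams
    "ceftriaxone": 2.0,
    "ciprofloxacin": 1.0,
    "meropenem": 3.0,
}

def ddd_per_100_patient_days(grams_used: dict, patient_days: int) -> dict:
    """Convert total grams dispensed per agent into DDD per 100 patient-days."""
    density = {}
    for agent, grams in grams_used.items():
        ddd = grams / WHO_DDD_GRAMS[agent]          # number of defined daily doses
        density[agent] = 100 * ddd / patient_days   # normalize to 100 patient-days
    return density

if __name__ == "__main__":
    usage = {"ceftriaxone": 1500.0, "ciprofloxacin": 800.0, "meropenem": 600.0}
    print(ddd_per_100_patient_days(usage, patient_days=12000))

Because the metric is normalized to patient-days, units and hospitals of very different sizes, or even different countries, can be compared on the same scale, which is what makes the benchmarking described above possible.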

3.1.7 United Kingdom (in This Case England)

The English Surveillance Programme for Antimicrobial Utilisation and Resistance (ESPAUR) was established in 2013 to support Public Health England (PHE) in the delivery of the UK Five Year Antimicrobial Resistance Strategy 2013–2018. Astonishingly, the report focuses only on human use of antimicrobials and does not integrate human and animal data, although in a few cases it does refer to problems specifically originating in animals (e.g. ESBL) [69]. The report documents that the estimated total number of human bloodstream infections caused by pathogens resistant to one or more key antimicrobials increased by 35% from 2013 to 2017. The burden of antimicrobial-resistant bloodstream infections is particularly marked for those caused by Enterobacteriaceae, especially E. coli, as these are the infections with the highest incidence, comprising 84.4% of the total. The burden of resistant infections remained unchanged for Gram-positive infections.

PHE also publishes a web-based tool intended to raise awareness of the value of comparing antimicrobial prescribing practices (https://fingertips.phe.org.uk/profile/amr-local-indicators). The tool refers to the observation that antimicrobial prescribing practices and antimicrobial resistance are inextricably linked, as overuse and incorrect use of antimicrobials are major drivers of resistance. The AMR local indicators described in the tool are publicly available data intended to raise awareness of antimicrobial prescribing and to facilitate the development of local action plans. The data published in the tool are intended for use by healthcare staff, academics and the public to compare the situation in their local area with the national picture.

3.1.8 Japan

The Japanese AMR One Health Surveillance Committee, covering human health, animals, food and the environment, publishes surveillance data on AMR and antimicrobial use, covering sources in the Ministry of Health as well as the Ministry of Agriculture [70].

In Japan, the proportion of carbapenem resistance in Enterobacteriaceae such as Escherichia coli and Klebsiella pneumoniae has remained at around 1% over the last decade, despite its global increase in humans. The proportion of Escherichia coli resistant to third-generation cephalosporins and fluoroquinolones, however, has been increasing, as has that of methicillin-resistant Staphylococcus aureus (MRSA), which accounts for approximately 50% of AMR hospital cases. In animals, monitoring of resistant bacteria in cattle, pigs and chickens has been conducted. Tetracycline resistance is common, although the degree of resistance depends on the animal and bacterial species. The proportion of third-generation cephalosporin- and fluoroquinolone-resistant Escherichia coli was low and remained mostly below 10% during the observed period (2011–2015). It should be noted that Japan imports more than 60% of all its food.

4 The Use of NGS/WGS

4.1 Next Generation Sequencing for the Surveillance of Foodborne Pathogens and Antimicrobial Resistant Microorganisms in the Food Chain

Foodborne pathogens (e.g. bacteria like Shiga toxin-producing Escherichia coli (STEC), Salmonella enterica (S. enterica), Campylobacter spp. and Listeria monocytogenes, viruses, fungi or parasites) and antimicrobial resistant bacteria (e.g. gut commensal bacteria – Escherichia coli (E. coli) and foodborne bacterial pathogens) in the food chain represent a major food safety concern in all regions. These pathogens can also be easily spread globally via the food chain due to global trading of animals and food products and international travel and movement of humans [71]. In order to improve food safety management of these bacteria in the global food chain through a “One Health” approach, the world needs surveillance and response systems that are capable of detecting them rapidly, understanding them and responding to them [72].

The “One Health” integrated approach involves “the collaboration of multiple disciplines, sectors and multiple groups working locally, nationally and globally to attain optimal health for people, animals and the environment” [73]. This framework is considered to be the most efficient, integrated approach to tackle foodborne disease and AMR threats in the food chain because of the complex, interrelated roles of humans, animals and the environment in the emergence/re-emergence and spread of these threats [74]. In recent years, next generation sequencing (NGS), including whole genome sequencing (WGS) and metagenomics testing, has emerged with great potential to revolutionize how microbiological food safety is managed. In particular, WGS has emerged as a new tool with great potential within a One Health context [75,76,77,78,79]. WGS provides the highest possible microbial subtyping resolution available to public health authorities for the surveillance of and response to foodborne disease and AMR threats. “When used as part of a surveillance and response system, it has the power to increase the speed with which threats are detected and the detail in which the threats are understood, and ultimately lead to quicker and more targeted interventions” [72]. NGS can be used widely in several areas to improve food safety management, including the use of WGS and metagenomics for foodborne disease outbreak investigation and epidemiological surveillance, as well as AMR surveillance. Owing to its rapidly declining cost, the application of NGS in food safety management could contribute to greater food/nutrition security, health care, animal and environmental protection, sustainable development, consumer protection, trade facilitation and tourism, which are all within the realm of global health security.

In order to elevate global health security to make the world a safer and more secure place from infectious disease threats, the health security considerations were initiated within WHO intergovernmental discussions from 2005, especially related to the WHO International Health Regulation. A more proactive agenda was developed through the “Global Health Security Agenda (GHSA) which pursues a multisectoral approach to strengthen both the global and national capacity to prevent, detect, and respond to human and animal infectious diseases threats, whether naturally occurring or accidentally or deliberately spread” [80, 81]. It was elaborated by Gronvall et al. that “The objectives of the GHSA will require not only a “One Health” approach to counter natural disease threats against humans, animals, and the environment, but also a security focus to counter deliberate threats to human, animal, and agricultural health and to nations’ economies” [82]. Hence collectively, foodborne zoonotic pathogens and antimicrobial resistant bacteria in the food chain is a “One Health” problem because animal health can directly affect human health, food safety, food security, economic stability and biodiversity, which in a bigger picture is also a “Global Health Security” problem.

As illustrated above, NGS is already recognised as a “One Health” tool capable of improving food safety management, which leads to a strengthening of food security and ultimately global health security. Whether a threat is naturally occurring or deliberate, NGS can be used in both scenarios to prevent, detect and respond to human and animal infectious disease threats. Though Global Health Security has a clear overlap with “One Health”, it also encompasses national economic, law enforcement and security considerations [82], and NGS data can be used by food safety and public health regulators to take regulatory action faster [83]. In the case of foodborne disease, the ability to quickly identify and track the causative foodborne pathogens leads to reductions in (1) the adverse impact on human health (e.g. fewer illnesses and lower death rates), (2) the number of contaminated products to be recalled, and therefore economic losses, and (3) public fear when the threat is deliberate (e.g. bioterrorism). In fact, surveillance of and response to foodborne pathogens and AMR by WGS have already been applied routinely by several national authorities, including Public Health England [84], the Statens Serum Institut in Denmark [85] and the Food and Drug Administration (FDA) in the United States (US) [86]. These countries can leverage their existing NGS infrastructure for food security purposes, and the additional cost to be incurred is limited to extending the current NGS-based surveillance and response system to plants, wild animals and their environment.

4.2 Next Generation Sequencing Platforms

The Sanger method (“first generation” technology) was the main sequencing technology used between 1975 and 2005 for microbial WGS [87]. It produces long (500–1000 bp), high-quality sequencing reads and has been regarded as the gold standard for sequencing DNA. The Sanger method was used to sequence the first bacterial genome, Haemophilus influenzae, in 1995 [88] and other bacterial genomes over the next few years [89,90,91,92,93]. In 2005, the NGS (“second generation” technology or massively parallel sequencing) era began; this high-throughput technology allows short sequencing reads (50–400 bp), and subsequently long sequencing reads (1000–100,000 bp), to be generated and detected in a single machine run, without the need for cloning. The short-read technologies, such as those employed by platforms provided by Illumina (e.g. MiSeq, NextSeq and HiSeq 2500) and Life Technologies (e.g. Ion Torrent Personal Genome Machine), produce read lengths ranging from ~100 to ~600 bp with a low per-base error rate (usually less than 1%) [94]. They are routinely used for the assembly of good-quality draft bacterial genomes that contain multiple contigs (up to around 100 of them) and are of good coverage (>95%) and high accuracy. To generate a fully closed, good-quality bacterial genome, the longer-read technologies (“third generation” technology or single-molecule sequencing), incorporated into platforms by Pacific Biosciences (e.g. PacBio RS II) and Oxford Nanopore (e.g. MinION), are commonly used together with the above-mentioned short-read technologies. Though longer-read technologies generate read lengths ranging from ~1000 to ~100,000 bp, the error rate is relatively high (15–30%), and they generally provide significantly lower coverage and are more expensive than short-read technologies [94]. However, when both technologies are used in combination to close genomes, also known as hybrid sequencing, the short reads generate good-quality contigs while the long reads close the gaps between the contigs during scaffold assembly. For more information on the main sequencing platforms and their performance, refer to the brief summary (see Table 2; [87]) and the detailed descriptions in the excellent recent reviews cited above [77, 94, 95].
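A useful back-of-the-envelope relationship underlying the coverage figures quoted above is that the expected mean depth of coverage equals the total number of sequenced bases divided by the genome size. The following minimal sketch uses hypothetical run parameters, not tied to any particular instrument, to illustrate why many short reads and far fewer long reads can both reach high depth on a typical ~5 Mb bacterial genome.

# Rough depth-of-coverage estimate for a bacterial sequencing run.
# All run parameters below are hypothetical, for illustration only.

def mean_depth(num_reads: int, read_length_bp: int, genome_size_bp: int) -> float:
    """Expected mean depth = total sequenced bases / genome size."""
    return num_reads * read_length_bp / genome_size_bp

# A short-read run: millions of ~150 bp reads on a ~5 Mb genome
short_read_depth = mean_depth(num_reads=2_000_000, read_length_bp=150,
                              genome_size_bp=5_000_000)

# A long-read run: far fewer, much longer reads on the same genome
long_read_depth = mean_depth(num_reads=50_000, read_length_bp=10_000,
                             genome_size_bp=5_000_000)

print(f"short reads: ~{short_read_depth:.0f}x, long reads: ~{long_read_depth:.0f}x")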

4.3 Whole Genome Sequencing for Foodborne Pathogens and Antimicrobial Resistant (AMR) Microorganisms

Through WGS of bacterial isolates, both pathogen identification and characterization, and detection of virulence factors and AMR genes, can be obtained directly from the sequence data, rapidly and at a level of precision that was not previously possible. Unlike traditional subtyping methods (e.g. serotyping, phage typing, PCR-based detection methods and pulsed-field gel electrophoresis; reviewed in detail by [77]), WGS is not organism-specific, allowing multiple bacteria to be sequenced simultaneously and enabling simpler, faster and cheaper laboratory operations compared to conventional microbiological methods. In addition, WGS offers ease of standardisation and harmonisation of operating protocols for WGS data collection, assessment of sequencing data quality, data processing and interpretation. Furthermore, WGS data provide a common, standardised language and can be deposited in online international public data repositories for global data sharing and comparison, as well as global surveillance of foodborne pathogens and AMR. To realise the full advantage of WGS in improving food safety management, however, the analysed data must still be interpreted in the context of appropriate food consumption history and epidemiological data. Lastly, WGS is more effective if it is used in a One Health and Global Health Security context, where WGS data from isolates from the multiple sectors involving human, animal and environmental health are shared and compared locally, nationally and globally.

The advent of NGS technology has led to a rapid rise in the number of high-quality draft genomes being deposited in online international public databases (e.g. the National Center for Biotechnology Information (NCBI), the European Nucleotide Archive (ENA) and the DNA Data Bank of Japan (DDBJ); these databases sync their data nightly) for global data sharing and downstream analysis. This is in part due to the ability to quickly and cheaply generate draft genomes from WGS data, since microbial genomes are smaller and more compact than eukaryote genomes. This has enabled microbial draft genomes to be generated routinely with a fast turnaround time for important applications in food safety management such as foodborne disease outbreak investigation and epidemiological surveillance, and AMR surveillance (see section 4.3.2). Traditional subtyping methods expose only a very small fraction of the entire genomic information of a foodborne pathogen and therefore provide limited resolution for discriminating outbreak-related strains from unrelated, sporadically circulating strains [79]. In contrast, WGS can in principle reveal the entire genomic information of a microbial pathogen, allowing strains that differ by a single nucleotide to be discriminated through comparison of bacterial sequences that are each millions of nucleotides in length [79].
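Conceptually, the strain-level discrimination described above comes down to counting nucleotide differences between aligned genome (or core-genome) sequences and flagging isolate pairs whose distance falls below a chosen threshold. The sketch below is a deliberately simplified illustration using toy sequences and a purely illustrative 10-SNP threshold; real investigations use validated pipelines and pathogen-specific thresholds interpreted alongside epidemiological evidence.

from itertools import combinations

def snp_distance(seq_a: str, seq_b: str) -> int:
    """Count positions at which two aligned sequences differ (ignoring gaps and Ns)."""
    return sum(1 for a, b in zip(seq_a, seq_b)
               if a != b and a not in "-N" and b not in "-N")

def cluster_candidates(isolates: dict, threshold: int = 10):
    """Return isolate pairs whose pairwise SNP distance is at or below the threshold.
    The 10-SNP default is illustrative only."""
    return [(x, y, d) for (x, sx), (y, sy) in combinations(isolates.items(), 2)
            if (d := snp_distance(sx, sy)) <= threshold]

# Toy aligned fragments standing in for whole genomes
isolates = {
    "patient_1": "ACGTACGTACGT",
    "patient_2": "ACGTACGTACGA",    # 1 SNP from patient_1
    "retail_meat": "ACGTTCGTACGA",  # 2 SNPs from patient_1
}
print(cluster_candidates(isolates))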

4.3.1 Whole Genome Sequencing in Foodborne Outbreak Investigation and Epidemiological Surveillance

For outbreak investigation, the foodborne pathogen must be linked to the correct food product, which is the source of infection. The investigation begins with subtyping isolates obtained from affected individuals, implicated food products and production facilities. Isolation is important because of the legal implications associated with any public health interventions or regulatory actions taken [77]. It is critical that the subtyping tool used is able to identify the pathogen down to strain (clone) level rather than species level, so that the sources of co-occurring outbreaks can be differentiated and targeted intervention strategies can be implemented. The typing tool must be able to clearly and precisely resolve the isolates so that isolates belonging to linked cases can be identified for inclusion in the investigation [77]. Similarly, it must also be able to differentiate concurrent, non-related and sporadic cases from outbreak cases so as not to confound the investigation [77]. The latter is becoming increasingly important as foodborne pathogens can easily cross country boundaries due to the globalisation of the food supply chain, and unrelated outbreaks may overlap temporally and geographically.

The ability of WGS technology to resolve an outbreak source was well demonstrated during the 2010 Haiti cholera outbreak [79], the most serious recorded cholera epidemic in recent history, responsible for killing at least 8000 people and sickening over 600,000 individuals [96]. This successful NGS application was enabled through the rapid public release of genome sequences by researchers holding Vibrio cholerae collections [97,98,99] and by the US CDC from the Haitian outbreak [79]. Through a joint analysis of available epidemiological data from the Haitian outbreak, publicly available sequence data and isolate data released by the Nepalese authorities, strong evidence suggested a single-source introduction of the outbreak strain from Nepal (the Nepalese UN contingent in Haiti) into Haiti [100]. Subsequently, several genomic-based epidemiological investigations also demonstrated the promising use of WGS in resolving outbreak investigations in a highly time-sensitive manner, and many reports on the use of WGS in outbreak investigations and surveillance have been published, some of which are covered in excellent reviews [77, 78].

Globally, the most extensive and well-known WGS-based application for food safety management is the GenomeTrakr Network [101]. GenomeTrakr is an international collaboration between the US FDA, the US CDC, the United States Department of Agriculture (USDA), NCBI, state health departments and international partners [101]. This network aims to collect WGS data from foodborne bacterial pathogens and upload them quickly to a publicly accessible database, NCBI. Once genomic data of bacteria (e.g. Salmonella, Listeria, E. coli, Campylobacter, Vibrio, Cronobacter), parasites and viruses from US surveillance efforts are available, they are uploaded by GenomeTrakr into NCBI [102]. The NCBI Pathogen Detection website plots phylogenetic trees to generate daily clusters that determine the closest matches to newly submitted data [103, 104]. Genetic relatedness suggests potential linkages between animal, food, environmental and human isolates, but it is not sufficient for regulatory action unless it is supported by epidemiological evidence. Apart from country-specific efforts in adopting WGS for food safety, a global effort – known as the Global Microbial Identifier (GMI) – has been underway, proposing the creation of a global genomic infrastructure and database that would enable this revolutionary new technology to identify and characterize microorganisms from animals, food, the environment and humans in a timely (minutes to hours) fashion, utilising an international interactive system of DNA databases containing the full genomes of all investigated microbial isolates in the world [105]. Notably, the GMI idea represents the notion of global inclusiveness, harnessing the benefits of this novel technology for all humankind, society and the environment. The basis for the GMI vision lies in the implementation of next generation DNA sequencing in microbiology labs around the world. Since its inception in 2011, GMI has garnered increasing support to advance the debate concerning the social, political, economic, ethical and technological barriers to realising its vision. GMI has been organising global meetings across Asia, the Americas and Europe, inviting international experts and participants to speak and discuss existing and emerging themes relating to the use of next generation sequencing (NGS) in clinical, public health and food microbiology, including virology. Lastly, WHO has published a landscape paper on “WGS for foodborne disease surveillance” [72]. This paper was drafted by technical experts from laboratories and public health authorities to provide comprehensive and relevant guidance. It summarizes some of the benefits and challenges inherent in the implementation of WGS and describes some of the issues developing countries may face [72]. It also provides an evidence base for some of the approaches to be considered for WGS implementation [72].

4.3.2 Whole Genome Sequencing in Antimicrobial Resistance Surveillance

Foodborne pathogens such as bacteria, viruses, fungi and parasites can enter the food chain at some point from farm to fork to contaminate foods, potentially causing human foodborne disease. While many foodborne diseases are mild and do not require treatment, antimicrobials may be prescribed to treat severe cases. However, with the increasing number of reported AMR foodborne pathogens, certain antimicrobials may no longer be effective against them and this poses a serious threat to public health.

AMR surveillance has typically relied on the isolation of culturable indicator microorganisms (e.g. Salmonella spp., Campylobacter spp., E. coli and Enterococcus spp. [106]) and the phenotypic characterization of animal, food, environmental and clinical isolates. This approach, sometimes combined with PCR-based genotypic detection of AMR genes, has been and will continue to be widely used in molecular epidemiology for AMR surveillance. However, such combinatorial approaches are unable to provide information on the mechanisms and drivers of AMR, or on the presence and spread of AMR genes throughout the global food chain [76]. The use of WGS can overcome these limitations, as evidenced by the increasing number of publications describing various WGS applications for AMR surveillance among isolates from animals, food, the environment and humans. One valuable WGS application is the ability to predict the phenotypic AMR profile from the WGS-based genotypic AMR profile. This has been demonstrated by several groups in S. enterica [107,108,109], E. coli [110, 111], Campylobacter spp. [112], Staphylococcus aureus [113] and Mycobacterium tuberculosis [114, 115], and a high resistance phenotype–genotype correlation (>97%) is commonly seen.
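The phenotype–genotype correlations quoted above are typically reported as simple concordance statistics between WGS-predicted and laboratory-measured susceptibility calls, often alongside the rates of "major" and "very major" errors used in susceptibility-testing validation. The following sketch uses fabricated paired results purely to show how such statistics are computed:

# Concordance between WGS-predicted and phenotypic susceptibility calls.
# The paired calls below are fabricated for illustration only.
# "R" = resistant, "S" = susceptible; one pair per isolate/drug combination.

pairs = [  # (phenotype, genotype_prediction)
    ("R", "R"), ("R", "R"), ("S", "S"), ("S", "S"), ("R", "R"),
    ("S", "S"), ("R", "S"),  # "very major" error: predicted S, actually R
    ("S", "S"), ("S", "R"),  # "major" error: predicted R, actually S
    ("R", "R"),
]

agree = sum(1 for p, g in pairs if p == g)
concordance = agree / len(pairs)

sensitivity = (sum(1 for p, g in pairs if p == "R" and g == "R")
               / sum(1 for p, _ in pairs if p == "R"))   # resistant isolates detected
specificity = (sum(1 for p, g in pairs if p == "S" and g == "S")
               / sum(1 for p, _ in pairs if p == "S"))   # susceptible isolates confirmed

print(f"concordance={concordance:.0%}, "
      f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")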

WGS-based AMR surveillance has already been adopted nationwide by the US public health surveillance system, the National Antimicrobial Resistance Monitoring System (NARMS) [116]. NARMS tracks changes in antimicrobial susceptibility and characterizes AMR in foodborne enteric bacteria found in ill people (CDC), retail meats (FDA) and food animals (USDA) [76, 116]. NARMS monitors antibiotic resistance among the following four major foodborne bacteria: Salmonella spp., Campylobacter spp., E. coli and Enterococcus spp. [116]. In a recent review, NARMS noted that WGS data alone can predict resistance in Salmonella [107] and other bacteria [111, 112] with a high degree of accuracy for most major drug classes [117]. NARMS has also generated a simple and publicly available tool, Resistome Tracker, which provides visually informative displays of antibiotic resistance genes in Salmonella across the globe [118]. Similarly, WGS-based AMR surveillance has also been conducted by the European Union (EU), through the EU harmonized antimicrobial resistance monitoring programme [119]. In a recent study under this programme, horizontal transfer was found to have played a major role in the spread of colistin resistance among bacteria (i.e. commensal bacteria and major foodborne pathogens, in this case Salmonella) in Italian meat-producing animals [119]. This was demonstrated by the presence of the same transferable colistin-resistance determinant on the same conjugative plasmid in both E. coli and major Salmonella serotypes isolated from the same intensive-farming industry in Italy [119]. In general, a global WGS-based AMR surveillance system has yet to be implemented, and most countries still depend on phenotypic testing and PCR-based genotypic methods for AMR surveillance. Nevertheless, many studies on the use of WGS-based AMR surveillance of the above-mentioned bacteria have been published, and some are described in an excellent review [76].

4.4 Metagenomics for Foodborne Pathogens and Antimicrobial Resistant Microorganisms

Metagenomics is a powerful tool that enables the direct, culture-independent analysis of complex microbiomes (e.g. food, water, fecal, soil or environmental samples) in one analytical procedure (one sequencing run). It allows the genomes of difficult-to-culture or non-culturable microorganisms to be analysed, since the entire DNA content of a sample is sequenced regardless of its origin [76]. Metagenomic data provide in-depth taxonomic identification (i.e. to species/strain level) and the relative abundance of the organisms present in the microbiome. Apart from characterizing the microbiome, metagenomics has potential applications in AMR surveillance [120]. For example, it could facilitate the tracking of AMR genes and mobile genetic elements in difficult-to-culture or non-culturable microorganisms, which might also play a role in the transmission of AMR across the food chain, as well as the monitoring of the abundance and diversity of AMR genes in animals, food, the environment and humans.

Compared to the use of WGS for foodborne outbreak investigation, epidemiological surveillance and AMR surveillance, the use of metagenomics for the same purposes is still in its early days (some examples are discussed in a recent review [76]). A recent extensive study examined the single largest metagenomic AMR monitoring effort in livestock to date (9 EU countries, 181 pig and 178 poultry farms, 359 herds, >9000 animal samples and >5000 GB of sequencing data); the pig and poultry resistomes showed great differences in abundance and composition [120]. There was a pronounced country-specific effect on the resistomes, more so in pigs than in poultry [120]. Pigs were found to have higher AMR loads, whereas poultry resistomes were more diverse. It is interesting to note that total AMR abundance in livestock was positively associated with overall country-specific antimicrobial usage, and countries with comparable usage patterns had similar resistomes. However, total functional AMR abundance was not associated with antimicrobial usage. This suggests that some genes might not provide AMR functionality in their natural hosts at natural expression levels, even though the same genes can provide AMR functionality when cloned and expressed in a host (usually E. coli) in functional metagenomic assays; this may have implications for assessing the risk to human health of AMR genes versus functional AMR genes.
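The resistome abundances compared in such studies are usually AMR-gene read counts normalized for both reference gene length and sequencing depth, so that samples sequenced to different depths remain comparable. The sketch below shows one common style of normalization (fragments per kilobase of gene per million fragments); the gene names, lengths and counts are hypothetical and not taken from the cited study.

# Normalize AMR-gene read counts from a metagenomic sample to an
# FPKM-like unit: fragments per kilobase of gene per million fragments.
# Gene lengths and counts below are hypothetical examples.

amr_hits = {            # reads mapped to each AMR reference gene
    "tet(W)": 1800,
    "erm(B)": 950,
    "blaCTX-M": 120,
}
gene_length_bp = {"tet(W)": 1920, "erm(B)": 738, "blaCTX-M": 876}
total_fragments = 25_000_000    # total sequenced fragments in the sample

def normalized_abundance(hits: dict, lengths: dict, total: int) -> dict:
    """Divide counts by gene length (kb) and by total fragments (millions)."""
    return {gene: (count / (lengths[gene] / 1000)) / (total / 1_000_000)
            for gene, count in hits.items()}

print(normalized_abundance(amr_hits, gene_length_bp, total_fragments))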

5 Food Sufficiency and Food Sustainability Assessment

Sustainable development is development that meets the needs of the present generation without compromising the ability of future generations to meet their own needs [121]. Although it is mostly understood to relate to the environment, it should be realized that sustainability assessments can – and should – refer to three areas: environmental, societal and economic. Thus, sustainable development in the food sector should focus on the conservation of land, water and plant and animal genetic resources, the avoidance of environmental degradation, technical appropriateness, economic viability and social acceptability [122]. It is estimated that around 25% of global greenhouse gas emissions come from food systems, and agriculture is also linked to deforestation, biodiversity loss, land degradation, water overuse and socioeconomic impacts [123]. Although more than sufficient food is produced annually in the world, hunger, undernourishment and micronutrient deficiencies still exist [124]. Meanwhile, obesity and diet-related non-communicable diseases pose great threats to human health. Therefore the focus should be not only on the sufficiency of food systems (which globally we have already achieved) but much more on the sustainability and nutritional value of the food production systems we use.

Sustainability assessments in the food sector can help gain insight into the sustainability performance of food systems, support monitoring and certification to provide proof to customers, assist in landscape planning, help farms assess the strengths and weaknesses of their set-up, and serve as a basis for management improvements or strategy development. More than 35 approaches [125,126,127,128,129] have been developed for sustainability assessment of farms, farming systems and supply chains, including the response-inducing sustainability evaluation (RISE), farm sustainability indicators (IDEA), the sustainability assessment in food and agriculture systems (SAFA) guidelines, the sustainability monitoring and assessment routine (SMART), etc. In addition, initiatives such as the Environmental Food Protocol and the Product Environmental Footprint (PEF) pilots (European Commission 2016) encourage the environmental life cycle-based assessment of food products [130, 131]. The UNEP/SETAC Life Cycle Initiative [132] has also published several papers, such as Towards a Life Cycle Sustainability Assessment, to enhance the global consensus and relevance of existing and emerging life cycle methodologies and data management. In the following paragraphs, the main research findings on quantitative sustainability assessment of food production and nutritional diets are summarized and discussed.
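Common to the life cycle-based approaches listed above is the characterization step of life cycle impact assessment, in which each inventoried emission or resource use is multiplied by a characterization factor and summed into an impact category indicator. In generic textbook notation (a sketch of the principle rather than the formula of any specific protocol):

\[
  IS_c \;=\; \sum_{i} CF_{c,i} \cdot m_i
\]

where $m_i$ is the inventoried quantity of elementary flow $i$ (e.g. kilograms of a substance emitted per functional unit of food), $CF_{c,i}$ is its characterization factor for impact category $c$ (e.g. kg CO$_2$-equivalents per kg of substance for climate change), and $IS_c$ is the resulting indicator score for that category.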

5.1 Sustainable Food Production Systems

As food production systems are among the leading drivers of impacts on the environment, it is important to assess and improve food-related supply chains as much as possible. Over the years, a large number of Life Cycle Assessment (LCA) studies [130, 133] have been carried out to assess agricultural and food processing systems and to compare alternatives “from farm to table”, including consideration of food waste management systems.

Most of the LCA studies [134,135,136] on organic versus conventional (non-organic) farming show that organic farming may be one of the solutions to minimize negative externalities and to reduce agriculture's impacts on the environment, mainly achievable by omitting synthetic fertilizers and pesticides, diversifying crops and applying organic fertilizers. Studies of local (regional) versus long-distance (imported) food supply chains have produced varying results. Some have found that foods produced locally use less energy and produce fewer GHG emissions than the same products from long-distance sources [137, 138]. Others have shown that location is especially important in the case of agriculturally derived products. Brodt et al. [137] found that California-produced (long-distance) conventional and organic tomato paste and canned diced tomatoes are almost equivalent in energy use and GHG emissions to products produced and consumed regionally in the Great Lakes region. Long-distance tomato production benefits from higher per-hectare yields and soil amendments with lower carbon dioxide emissions, which substantially offset the added energy use and GHG emissions associated with long-distance shipment of products by rail.

It has been suggested by LCA studies [139] that agricultural intensification leads to lower overall environmental impacts, meaning that increasing land use efficiency is a logical way forward to mitigate the pressure from urbanization. Meanwhile, developed cities have great capacity to mitigate emissions through careful choice of sustainable food practices, which can reduce embodied greenhouse gases, mitigate the urban heat island effect and improve storm water management. However, the impacts on food waste minimization and ecological footprint reduction should be further explored. For cities to remain food secure, with strong resilience to potential future climate, fossil fuel, land and water resource constraints, a multifaceted approach to fresh food production such as local commercial peri-urban horticulture is recommended by Rothwell et al. [140]. Some advanced farming technologies [141], such as hydroponic farming, could be promising for more sustainable food production, especially in terms of land use and water consumption [142].

Aquaculture is the fastest growing food sector and has increasing economic importance, providing healthy proteins for humans and complementing the limited availability from overexploited fisheries. FAO has proposed Best Management Practices (BMP) to enhance sustainable aquaculture production [143, 144]. The goal of BMPs is to make aquaculture environmentally responsible, while also considering social and economic sustainability [145]. A recent comprehensive review by Bohnes et al. [146] found that the species farmed and the feed conversion ratio (FCR) obtained within the system are particularly important factors determining the environmental performance of aquaculture systems. Aquaculture feed production is a key driver of climate change, acidification, cumulative energy use and net primary production use, while the farming process is a key driver of eutrophication.

It is also suggested that seafood farmers should focus on improving the general management of aquaculture systems, with specific attention to nutrient management, water management and the choice of adapted, FCR-optimized aquafeed. Some technologies, such as polyculture and recirculating aquaculture systems (RAS), have great potential to reduce the environmental impacts of aquaculture systems. A global effort to optimize, integrate and disseminate such combined technologies could lead to a sustainable blue revolution in aquatic systems, similar to the green revolution for terrestrial crop production [145].
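Because FCR is so central to the environmental performance of aquaculture, its definition is worth stating explicitly (a standard aquaculture metric, not specific to the cited studies):

\[
  \mathrm{FCR} \;=\; \frac{\text{feed administered (kg)}}{\text{biomass gain (kg)}}
\]

A lower FCR therefore means less feed, and hence less feed-related upstream impact, per kilogram of fish produced, which is why feed optimization features so prominently in the recommendations above.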

5.2 Nutritionally Sustainable Diet

FAO has defined sustainable diets as “those diets with low environmental impacts which contribute to food and nutrition security and to healthy life for present and future generations. Sustainable diets are protective and respectful of biodiversity and ecosystems, culturally acceptable, accessible, economically fair and affordable; nutritionally adequate, safe and healthy; while optimizing natural and human resources” [147]. The sustainable diets definition establishes four main goals: human health and nutrition, cultural acceptability, economic viability and environmental protection [148], highlighting long-term health and protection of the environment. The Mediterranean diet is a typical model system for developing and validating methods and indicators for sustainable diets [149].

Dietary choices have great global impacts on environmental sustainability and human health. A recent study by Tilman and Clark [150] suggests that current diets – with high levels of processed foods, refined sugars and fats, oils and meats – are greatly increasing the global incidence of type II diabetes, cancer and coronary heart disease, as well as causing globally significant increases in GHG emissions and contributing to the clearing of tropical forests, savannas and grasslands. Alternative dietary options (the Mediterranean diet, the pescetarian diet, the vegetarian diet, etc.) could substantially improve both human and environmental health.

In addition, methods for combined nutritional and environmental assessment are being developed, such as the Combined Nutritional and Environmental Life Cycle Assessment (CONE-LCA) framework, which evaluates and compares in parallel the environmental and nutritional effects of foods and diets [151]. Stylianou et al. [152] provided the first quantitative, epidemiology-based estimate of the complements and trade-offs between the nutritional and environmental human health burdens, expressed in Disability Adjusted Life Years (DALYs), using the example of adding one serving of fluid milk to the present US adult diet. This addition increased the environmental health burden of the total diet (in terms of particulate matter (PM) and global warming (GW) impacts) but at the same time yielded other, more beneficial, impacts on human health. It will be important in the future to further develop these types of more holistic evaluations, including One Health, environmental and socio-economic metrics.
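The DALY metric that underlies such combined assessments is additive, which is what allows nutritional benefits and environmental burdens to be weighed on a single scale. In the standard formulation (sketched here in general terms rather than as the specific CONE-LCA implementation):

\[
  \mathrm{DALY} \;=\; \mathrm{YLL} + \mathrm{YLD}
\]

where YLL denotes the years of life lost to premature mortality and YLD the years lived with disability. The net health effect of a dietary change can then be framed as

\[
  \Delta \mathrm{DALY}_{\text{net}} \;=\; \Delta \mathrm{DALY}_{\text{nutrition}} + \Delta \mathrm{DALY}_{\text{environment}},
\]

with a negative total indicating a net health benefit of the change.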