Keywords: Malaria elimination, health insecurity, multidrug resistance, community engagement, vigilance, sentinels
Malaria is a parasitic infection, sometimes fatal, that has plagued humankind for millennia. Over that time, it has arguably killed more humans than any other disease, and it remains a salient threat to global health security today. Malaria has proven notoriously difficult to control, in part because of its complex etiology. Plasmodium protozoa are transmitted by the bite of multiple species of female Anopheles mosquitoes. Of the hundreds of Plasmodium species, only four – Plasmodium vivax, P. falciparum, P. malariae, and P. ovale – infect human hosts, where their single or combined presence produces, in the nonimmune, core symptoms of prolonged fevers that can progress, in some cases, to fatal cerebral malaria.
Different frameworks and entry points for intervention complicate attempts to reduce malaria’s impact. Malaria is simultaneously (1) a disease subject to entomological management (i.e., the management of insect populations as vectors of the disease), since it is mosquito-borne and potentially open to genomic manipulation; (2) a medical concern, since it responds to pharmaceutical intercession even amid the sequential failure of treatment regimens; and (3) a public health calamity, given its high mortality, variable immunity, and synergy of coinfections with HIV and tuberculosis (Webb 2009). At the same time, malaria and poverty mutually reinforce an entrenched disease burden in affected rural and remote communities around the world. Limited attention to socioeconomic interventions, in favor of the science of parasitology and vector control, is a further dimension underlying malaria’s persistence (Brown 1997).
In evolutionary terms, malaria in humans is a host-parasite relationship that has existed for many thousands of years. While there is some evidence that malarial mosquitoes were present when dinosaurs roamed the earth, ecological analyses suggest that around 10,000 years ago, Plasmodium parasites rapidly expanded in Africa and spread worldwide alongside human population growth and movement, facilitated by evolving agricultural innovations (Hay et al. 2004). Records documenting its human impact are found in Chinese scripts from about 2700 BC. Clay tablets from Mesopotamia dating to 2000 BC describe its deleterious effects, as do Egyptian papyri from 1570 BC and Hindu texts as far back as the sixth century BC (Cox 2010). Other sources, including Hippocrates in about 400 BC, indicate that people were well aware of malarial fevers and enlarged spleens among those living in marshy places. Protective measures date back at least to Herodotus (484–425 BC), who described that in Egypt, people near marshes slept in towers that mosquitoes could not reach, while in the marshlands they slept under nets (Heggenhougen et al. 2003). The belief that miasmas rising from swamps caused malarial fevers persisted for millennia, and it is widely held that the disease’s name comes from the Italian mal’aria, meaning spoiled air. It was not until 1880 that Alphonse Laveran discovered that malaria is caused by a protozoan parasite.
In centuries past, malaria was present in subtropical climes as well as the tropics. Control efforts since the 1900s have decreased malarial zones by around half, from 53% to 27% of the earth’s land surface (Hay et al. 2004). Despite this declining footprint, in 2017 malaria was present in 91 countries (down from 108 in 2000), where roughly 3 billion people, nearly half the world’s population, remain at some level of risk (WHO 2017).
Projected risk translates into very real mortality. According to the WHO, in 2015 there were 212 million new cases of malaria worldwide, leading to an estimated 429,000 fatalities. Most deaths occurred in the African Region (92%), followed by the Southeast Asia Region (6%) and the Eastern Mediterranean Region (2%). Children under 5 years old and pregnant women are most at risk, with Plasmodium falciparum being the main cause of severe clinical malaria and death. In 2015, malaria killed an estimated 303,000 children under 5 globally, including 292,000 in the African Region. Put another way, malaria claimed the life of one child every 2 minutes (WHO 2017). Given this exacerbated disease burden, malaria’s impact on developing countries’ economies over recent decades has been massive. Alongside ongoing morbidity and mortality, malaria’s ability to repeatedly defy chemical (and behavioral) interventions is an insistent alarm bell flagging the fallibility of public health interventions. Multidrug-resistant strains of Plasmodium have reappeared over the past decade. At present, they are reportedly limited to the borderlands of mainland Southeast Asia, but there is concern that if, as in the past, resistance reaches Africa, it would mean hundreds of thousands of additional deaths (WHO 2014).
Faltering pharmaceutical controls remind us that the success of malaria interventions has always been historically and geographically uneven. Malaria was eliminated from the US and European countries in the first half of the twentieth century largely through changes in human behavior, including improved housing and self-protection (Heggenhougen et al. 2003). Meanwhile, in tropical countries, colonial hygiene regimes sought to protect European populations, as well as to maintain a healthy local labor force, by destroying mosquito-breeding sites and using quinine as both preventative and cure. This changed from the late 1940s, when insecticides, most notably DDT, and synthetic drugs such as chloroquine became widely available. For a brief period in the 1950s, championed by the WHO, total elimination of malaria seemed possible. But by the end of the 1960s, the loss of acquired immunity among targeted populations, coupled with growing chloroquine and DDT resistance, demolished any hope that malaria would be vanquished. Global eradication programs were shelved, their demise attributed in large part to the social and organizational shortcomings of a one-size-fits-all approach (Wessen 1986). Control and containment became the fallback position. In the meantime, malaria deaths rose again until early in the twenty-first century as “a new global iatrogenic form of malaria [i.e., one to which the earlier countermeasures contributed] was emerging. In its well-meaning zeal to treat the world’s malaria scourge, humanity had created a new epidemic” (Garrett 1995, p. 52).
From the early 2000s, malaria declined globally due, in large part, to concerted attention and funding generated by the Millennium Development Goals. Led by the Roll Back Malaria Partnership, which began in 1998, multisectoral malaria initiatives employed methods more carefully adapted to local circumstances, including strengthened local health systems, early containment of epidemics, rapid diagnosis and treatment (in particular targeting children), and preventive and protective behavioral measures such as long-lasting insecticide-treated bed nets (Packard 2007). Since the turn of the millennium, 21 countries have achieved elimination, defined as 3 years of zero indigenous cases (most recently Sri Lanka and Kyrgyzstan in 2016). Between 2010 and 2015, new malaria cases (incidence) fell by 21% globally and by the same amount in Africa. During this period, mortality rates fell by an estimated 29% globally and by 31% in the African Region, while the malaria mortality rate among children under 5 fell by an estimated 35%. Other regions also achieved significant reductions in their malaria burden. Since 2010, the malaria mortality rate has declined by 58% in the Western Pacific, by 46% in Southeast Asia, by 37% in the Americas, and by 6% in the Eastern Mediterranean (WHO 2017).
Malaria’s Resistance to Eradication
Decreasing incidence notwithstanding, malaria’s threat to global health security remains potent. Over humanity’s long struggle with malaria, the sequential deployment of remedies has been intermittently successful at best. Since before the seventeenth century, indigenous groups, and subsequently Jesuit missionaries, used cinchona bark to treat malarial fevers (Achan et al. 2018). In 1820, quinine was identified as the antimalarial alkaloid in this South American tree. In the first half of the twentieth century, access to its bark was limited during the two world wars, prompting urgent attempts to manufacture antimalarials based on analogs of quinine. Unfortunately, the effectiveness of synthetic chloroquine, and of antifolates such as pyrimethamine, proved time-bound. As one after another of the treatment drugs became less effective, artemisinin, a plant derivative well known in China for thousands of years, gained global attention and distribution. Unfortunately, further adaptation of the Plasmodium parasite means artemisinin is also losing its potency.
Antimalarial drug resistance to chloroquine (measured as delayed clearance of the parasite) first appeared in 1957 in forested regions of the Thai-Cambodian border (and at almost the same time in parts of South America); resistance to sulfadoxine-pyrimethamine and then to mefloquine subsequently arose in the same region. By the late 1970s, resistance genes had spread from Southeast Asia across South Asia, ultimately causing millions of deaths in sub-Saharan Africa. Since 2001, artemisinin-based combination therapies (ACTs) have been used as the most effective first-line treatment for uncomplicated malaria. But in 2006 drug-resistant parasites were again detected in the Thai-Cambodian border region (Noedl et al. 2008) and subsequently identified in Southern Laos and in Vietnamese and Thai-Myanmar border regions as well. Resistance, associated with mutations in the PfKelch gene, is also appearing to the drugs used in combination with artemisinin, such as piperaquine (Imwong 2017).
Fear of a multidrug-resistant malaria “superbug” has galvanized urgent global health responses focusing on Southeast Asia as the source of rising resistance. With no new drug class yet available, major donors have rallied under the framework of the WHO’s Emergency Response to Artemisinin Resistance (ERAR) to target the Greater Mekong Subregion (GMS) “hotzones” and to fast-track elimination targets. Currently, GMS countries (Vietnam, Lao PDR, Thailand, Cambodia, Myanmar, and two provinces of southwestern China) have signed on to a 2025 deadline for the eradication of P. falciparum. The logic is simple: elimination negates the need for the containment of resistance genes because transmission is removed everywhere. Current strategies to achieve elimination in the GMS border zones rely heavily on early detection and treatment alongside conventional vector control and prevention activities. In addition, mass drug administration (MDA) is being trialed in selected sites along the Thai-Myanmar border (the area of highest burden in the GMS) as well as in Laos and Cambodia. In this approach, all villagers or community members are treated in order to eradicate the parasite in both symptomatic and asymptomatic carriers. Aware of the potential pitfalls of mass treatment, the WHO convened an evidence review group, which concluded that MDA should be used cautiously and only in very specific circumstances (MPAC 2016). The approach remains controversial given historical examples where mass treatment has created adverse consequences both in terms of community uptake and parasite resistance: “There is a strong correlation between the geographic areas where MDA programs were initiated and the places where chloroquine resistance first emerged” (Packard 2014, p. 399).
The stakes are high and scientific breakthroughs are anxiously awaited. Genomic modification of mosquitoes is underway in various trials to make them immune or hostile to the Plasmodium parasite. So too, promising human vaccines are in the pipeline, with pilot trials of a first-generation product taking place in 2018. There are no guarantees as to the effectiveness of either initiative. Meanwhile, as incidence reduction continues through conventional methods, elimination faces ongoing challenges such as declining immunity, declining local interest, and declining national budgets that, in turn, go hand in hand with parasite adaptation and a burgeoning array of distal (i.e., nonmedical) factors, or “risks of risk,” which create residual vulnerability and outbreak volatility. If anything, the specter of multidrug-resistant malaria is a reminder that nonmedical strategies are needed alongside medical interventions to address the larger sociocultural underpinnings of disease spread and, furthermore, that they are needed now precisely because they have not been made central in previous decades of malaria interventions.
Given that malaria elimination must, of necessity, intervene in every pocket of infection, effective community targeting has become a priority. In this context, various groups in the GMS border areas, including migrants and mobile populations, ethnic minorities, displaced peoples, and military workers, are designated as key affected groups through which to control resistance (Cui et al. 2012). Of these, migrants and mobile groups are usually prioritized as the most important to preserving global health security and, at the same time, the most in need of having their own health secured. Unfortunately, effective interventions are complicated by the fact that, across the GMS, movement is increasing, particularly in border zones where diverse populations are encouraged by regional economic integration to interact within changing social and physical landscapes. In turn, precarious lifestyles result from insecurity of residence, employment, legal status, access to health resources, and so forth, which, alongside environmental change, give rise to constellations of malaria risks. Hence the WHO’s blunt assertion that “we need smarter and better programs to address migrant and mobile populations” (WHO 2014, p. 49) and the widespread recognition that improved community engagement is needed to achieve elimination.
Malaria’s Lessons for Health Security
In 2018, the WHO included a generic “Disease X” in its list of the world’s nine most pressing infectious disease threats. This non-specified disease acts as a placeholder on the assumption that there will be an imminent and inevitable, although as yet unidentified, pandemic threat to global health security. It is now more than two decades since think tanks urgently mobilized a global health response to the worrisome forecast of pandemic outbreaks. Since then, human security has become a popular framework for donor-driven development assistance, and within this unified approach, health conjoins with security to shore up coordinated responses to transnational health incidents. Combining medicalized security with securitized health is seen as the best way to forestall pandemic events, which not only threaten lives but can destabilize economies and disrupt social and political cohesion. At the same time, this integration brings attention to health “insecurities” generated by social and structural factors. Given that malaria interventions have often faltered due to an overemphasis on biological processes rather than addressing distal factors, we now know that “the array of biomedical weapons mobilized in the war against malaria needs to be joined with efforts to improve the social and economic conditions that drive the epidemiology of the disease” (Packard 2007, p. xvii).
Recognizing related factors, such as the precarious conditions that drive individuals to put themselves in harm’s way, is one thing. Being prepared for new malaria outbreaks is another. As malaria elimination advances, it will require concerted community engagement to maintain the participation of communities as incidence declines. Community members in endemic malarial zones are crucial to health security on two fronts: first, as immunity wanes with fewer cases, they are ideally placed to assist people entering their locales to protect themselves or seek treatment; second, they are central to activities that prevent malaria’s reintroduction and contain outbreaks (Whittaker and Smith 2015). Furthermore, if preemption through the removal of “insecurities” is one cornerstone of robust global health security, then locals directly affected by malaria’s spread need to be more than early test-and-treat practitioners. It is equally important that they be given the opportunity to provide sentinel information on changing situational vulnerability.
Improved community participation is pivotal to malaria eradication as well as to a reframed public health approach combating other infectious disease threats to global health security. This “new” approach replaces large-scale epidemiological modelling with the concept and practice of vigilance, which aims to detect and forestall rather than reactively contain disease outbreaks, relying on early warnings provided by sentinel devices (Lakoff 2015). The shift is needed because we live within a circuitry of material and affective networks that allows sites of risk to proliferate around the world, resulting in the assumed inevitability of an unpredictable but potentially catastrophic infectious pandemic. Importantly, vigilance implies predictive capacity as well as timely detection. This, in turn, requires channels of communication that can signal the changing local conditions that bring people and microbial pathogens into contact. For local communities at risk of malaria or other infectious diseases, this “finger on the pulse” is made possible by a ready-made ethnographic sensitivity to social, economic, and environmental changes that potentially signal danger. As sentinels, affected locals embody the circumstances in which preventive behaviors no longer protect. They know firsthand what factors spur people into economically or politically precarious situations where they may become vulnerable to infectious disease threats; that is to say, they are most aware of the insecurities that need to be remedied to make global health secure (Lyttleton 2016).
To offer such insights, however, affected communities need to be included within information chains that can interpret and utilize nonmedical data on factors prompting increased risk. In other words, the concept of a sentinel requires somebody (or some bodies) to act as the trigger for an alert and to be given the voice and opportunity to send information upward along the communication chain. The problem is that in many disease control programs, community volunteers are enlisted to assist in disease management – in the case of malaria, the delivery of top-down services such as rapid testing, treatment, and case reporting – but not as sentinels at the base of a two-way information chain vigilant to risk. This has been evident in numerous instances where locals perform the public health duties required of them as local health volunteers but can seldom speak of what they know in terms of lifestyle precarity or disease vulnerability (Lyttleton 2018).
Malaria offers many lessons for health security through its obstinate challenges to global eradication. Preventing its outbreaks requires combining biological intercessions with the predictive ability to process data and resolve distal aspects of disease vulnerability. Malaria’s historical persistence makes it apparent that health security must engage a landscape of local health “insecurities.” In this light, how we conceptualize infectiousness or resistance as the central target of interventions is worth re-examining. In making health security more robust, we should learn from examples where the “infectious” element is not only a biological pathogen but also the lifestyles that “persuade” people to put their bodies at risk of disease in the first place. Likewise, resistance is more than a genetic characteristic: it also describes human behavior that ignores cautionary health messages in the contagious pursuit of material gain, and public health systems that are unwilling or unable to incorporate nonmedical data into predictive planning.
- Achan, J., Mwesigwa, J., Edwin, C., & D’Alessandro, U. (2018). Malaria medicines to address drug resistance and support malaria elimination efforts. Expert Review of Clinical Pharmacology, 11(1), 61–70.
- Brown, P. (1997). Culture and the global resurgence of malaria. In M. C. Inhorn & P. J. Brown (Eds.), The anthropology of infectious disease: International health perspectives (pp. 119–141). Amsterdam: Gordon and Breach Science Publishers.
- Garrett, L. (1995). The coming plague. New York: Penguin Books.
- Heggenhougen, H., Hackethal, V., & Vivek, P. (2003). The behavioural and social aspects of malaria and its control. Geneva: UNDP/World Bank/WHO.
- Packard, R. (2007). The making of a tropical disease: A short history of malaria. Baltimore: Johns Hopkins University Press.
- WHO. (2014). Feasibility of Plasmodium falciparum elimination in the Greater Mekong Subregion: Malaria Policy Advisory Committee meeting, Geneva, 11 September.
- WHO. (2017). World malaria report 2016: Summary. Geneva: World Health Organization.